Flume and Sqoop: Getting Data into Hadoop [Webinar]

Data is woven into every sector of the global economy, and it’s a huge competitive advantage when put to use in smart ways. From improved operational efficiency to deeper customer intelligence to insights into new business opportunities, Big Data isn’t going away anytime soon. Every day, 2.5 billion gigabytes of data are generated from different sources. That’s a lot of data to put to use!

If you haven’t heard of Hadoop, now is the time to learn, because it’s designed to store and process large data sets. If writing scripts to load data into a system from external sources sounds like a daunting task, fear not. Let me introduce you to two powerful tools built to move data into Hadoop from your existing systems: Flume and Sqoop.

  • Flume is a tool for streaming large amounts of data from many different sources into your Hadoop environment.
    • Use it when: moving newly generated log data into HDFS (see the Flume sketch after this list).
  • Sqoop is a tool that moves data between relational databases (RDBMS) and Hadoop.
    • Use it when: moving historical data from transactional databases into Hadoop for post-processing (see the Sqoop sketch after this list).
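
To give you a taste of what the demos will cover, here is a minimal sketch of a Flume agent configuration that tails an application log and lands the events in HDFS. The agent name, log path, HDFS directory and capacity values below are placeholders for illustration, not settings from the webinar; adjust them to your own environment:

    # flume-webinar.conf: tail a log file and write the events into HDFS
    agent.sources = tail-source
    agent.channels = mem-channel
    agent.sinks = hdfs-sink

    # Source: follow newly generated log lines (placeholder path)
    agent.sources.tail-source.type = exec
    agent.sources.tail-source.command = tail -F /var/log/app/app.log
    agent.sources.tail-source.channels = mem-channel

    # Channel: buffer events in memory between source and sink
    agent.channels.mem-channel.type = memory
    agent.channels.mem-channel.capacity = 10000

    # Sink: write events into date-partitioned HDFS directories
    agent.sinks.hdfs-sink.type = hdfs
    agent.sinks.hdfs-sink.channel = mem-channel
    agent.sinks.hdfs-sink.hdfs.path = /flume/logs/%Y-%m-%d
    agent.sinks.hdfs-sink.hdfs.fileType = DataStream
    # Needed so the %Y-%m-%d escapes work without a timestamp interceptor
    agent.sinks.hdfs-sink.hdfs.useLocalTimeStamp = true

You would start this agent with flume-ng agent --conf-file flume-webinar.conf --name agent.

A Sqoop import is even shorter: one command pulls a table out of a relational database and into HDFS, splitting the work across parallel map tasks. The connection string, username, database and table names here are likewise made up for the example:

    # Import the "orders" table from MySQL into HDFS using 4 parallel mappers
    sqoop import \
      --connect jdbc:mysql://dbhost:3306/sales \
      --username reporting \
      -P \
      --table orders \
      --target-dir /data/sales/orders \
      --num-mappers 4

The -P flag prompts for the database password instead of putting it on the command line, and --num-mappers controls how many map tasks share the import.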

Ready to make working with big data a smaller job? Join us for a webinar on 28 July, “Getting Data into Hadoop”, where I will introduce the use cases and architecture of Flume and Sqoop, and then demonstrate their power in live demos. Register now to secure your seat.

If you missed the previous four webinars in our Big Data series, you can catch up on our YouTube channel.

Want to know more about the magic of Big Data? Click here for an in-depth overview.

