Published November 30, 2020 | Version v1
Journal article (Open Access)

Sqoop usage in Hadoop Distributed File System and Observations to Handle Common Errors

  • 1. Associate Professor, CS & IT, Jain University, Bangalore
  • 2. Professor and Academic Head, School of CS and IT, Jain University
  • 3. Professor, CSE, Narasaraopeta Engineering College (Autonomous), Narasaraopet, A.P.


The Hadoop framework provides a way of storing and processing huge amounts of data. Companies such as Facebook, Twitter, and Amazon use Hadoop ecosystem tools to store data in the Hadoop Distributed File System (HDFS) and to process it with MapReduce (MR). The current work describes the usage of Sqoop for importing data into and exporting data out of HDFS, and covers the various import/export commands supported by the Sqoop tool in the Hadoop ecosystem. The importance of the work lies in highlighting the common errors encountered while installing and working with Sqoop. Many developers and researchers use Sqoop to perform the import/export process and to handle source data in relational format. The current work presents the connectivity between MySQL and Sqoop, along with the usage of various commands and their results. As the outcome of the work, the possible errors encountered for each command are listed together with their corresponding solutions. The common configuration settings required to run Sqoop without errors are also described.
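As an illustration of the kind of import/export commands the paper surveys, here is a minimal sketch of a Sqoop round trip between MySQL and HDFS. The database name, table names, credentials, and HDFS paths below are hypothetical placeholders, not taken from the paper, and assume a working Hadoop cluster with the MySQL JDBC driver available to Sqoop.

```shell
# Hypothetical example: import a MySQL table into HDFS with Sqoop.
# Host, database, table, user, and paths are placeholders.
sqoop import \
  --connect jdbc:mysql://localhost:3306/retail_db \
  --username sqoopuser -P \
  --table customers \
  --target-dir /user/hadoop/customers \
  --num-mappers 1

# Export the HDFS data back into a MySQL table
# (the target table must already exist in the database).
sqoop export \
  --connect jdbc:mysql://localhost:3306/retail_db \
  --username sqoopuser -P \
  --table customers_export \
  --export-dir /user/hadoop/customers
```

A frequently reported setup issue of the kind the paper discusses is a missing MySQL Connector/J jar: Sqoop needs the JDBC driver on its classpath (typically placed in `$SQOOP_HOME/lib`) before any `--connect jdbc:mysql://...` command will succeed.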



Files (243.5 kB)

Additional details

Related works

Is cited by: Journal article, ISSN 2277-3878
