ETLs with Java and Hive queries to Spark ETLs, MapReduce, Spark
₹1500-12500 INR
Paid on delivery
Hi all,
Looking for support on the below skill set:
Transition of legacy ETLs with Java and Hive queries to Spark ETLs.
Design and develop data processing solutions and custom ETL pipelines for varied data formats such as Parquet and Avro.
Design, develop, test, and release ETL mappings, mapplets, and workflows using StreamSets, Java MapReduce, Spark, and SQL.
Let me know if you have experience with this.
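For context, a minimal sketch of what the Hive-to-Spark migration step typically looks like. This is an illustration only: the table name `sales.orders` and the output paths are hypothetical, and the Avro writer assumes the `spark-avro` package is on the classpath.

```scala
import org.apache.spark.sql.SparkSession

object LegacyEtlMigration {
  def main(args: Array[String]): Unit = {
    // Enable Hive support so existing Hive tables and metastore entries remain usable
    val spark = SparkSession.builder()
      .appName("hive-to-spark-etl")
      .enableHiveSupport()
      .getOrCreate()

    // A legacy Hive query can often be run as-is through Spark SQL
    // (sales.orders is a hypothetical table)
    val orders = spark.sql(
      "SELECT customer_id, SUM(amount) AS total FROM sales.orders GROUP BY customer_id")

    // Write the result as columnar Parquet ...
    orders.write.mode("overwrite").parquet("/data/out/orders_parquet")

    // ... and as row-oriented Avro (requires the spark-avro package)
    orders.write.mode("overwrite").format("avro").save("/data/out/orders_avro")

    spark.stop()
  }
}
```

In practice the harder part of such migrations is usually validating that the Spark output matches the legacy Java/Hive output row for row, not rewriting the queries themselves.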
Project ID: #32188044
About the project
4 freelancers are bidding an average of ₹7625 for this job
Hello, I have 9+ years of working experience and am a certified big data developer. Please message me here to discuss the project requirements further. I will complete your work for a reasonable fee. Best regards, Jay
Hi, I have good experience with Spark-based data processing and ETL applications on local, Hadoop, and AWS environments. I prefer to use Scala but am happy to support your requirements.
Hi, I have 6 years of experience in the skill set you mentioned and work as a senior big data engineer. Let me know more about the requirements and we can discuss further on a call. Thanks