Transition of legacy ETLs with Java and Hive queries to Spark ETLs (MapReduce, Spark)

Closed · Paid on delivery

Hi all,

Looking for support on the skill set below:

Transition legacy ETLs built with Java and Hive queries to Spark ETLs.
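As a minimal sketch of what such a migration can look like (assuming a hypothetical Hive table `sales` with `customer_id` and `amount` columns): with `enableHiveSupport()`, Spark can run the legacy HiveQL unchanged against the existing metastore, or the same logic can be re-expressed with the DataFrame API.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.sum

object HiveToSparkEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("HiveToSparkEtl")
      .enableHiveSupport() // read tables registered in the existing Hive metastore
      .getOrCreate()
    import spark.implicits._

    // Option 1: run the legacy HiveQL as-is through Spark SQL.
    val totalsSql = spark.sql(
      "SELECT customer_id, SUM(amount) AS total FROM sales GROUP BY customer_id")

    // Option 2: the same aggregation rewritten with the DataFrame API.
    val totalsDf = spark.table("sales")
      .groupBy($"customer_id")
      .agg(sum($"amount").as("total"))

    totalsDf.write.mode("overwrite").parquet("/warehouse/sales_totals")
    spark.stop()
  }
}
```

Running the old HiveQL through `spark.sql` first, then converting hot paths to the DataFrame API, lets the migration proceed incrementally with identical results at each step.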

Design and develop data processing solutions and custom ETL pipelines for varied data formats such as Parquet and Avro.
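Handling both formats in Spark is largely symmetric; a short sketch with hypothetical paths (Parquet support is built in, while Avro requires the external `org.apache.spark:spark-avro` package on the classpath, e.g. via `--packages` at submit time):

```scala
import org.apache.spark.sql.SparkSession

object FormatsDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("FormatsDemo").getOrCreate()

    // Read a Parquet dataset; the schema travels inside the files.
    val events = spark.read.parquet("/data/events_parquet")

    // Write the same data as Avro (needs the spark-avro package).
    events.write.format("avro").save("/data/events_avro")

    // Reading Avro back works symmetrically.
    val eventsAvro = spark.read.format("avro").load("/data/events_avro")

    spark.stop()
  }
}
```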

Design, develop, test and release ETL mappings, mapplets and workflows using StreamSets, Java MapReduce, Spark and SQL.

Let me know if you have experience with these.

Java Spark Informatica PowerCenter ETL Amazon Web Services Software Architecture

Project ID: #32188044

About the project

4 bids · Remote project · Last activity 2 years ago

4 freelancers are bidding an average of ₹7625 for this job

manojmo

Hi, I have 20+ years of experience working with Java, web and database tech. I have worked a bit on Spark, creating some reports. I have also built custom ETL jobs, using tools like Talend. I have working knowledge, but …

₹7000 INR in 7 days
(4 reviews)
4.5
hjdsolution

Hello, I have 9+ years of working experience and am a certified big data developer. Please message me here to discuss the project requirements further. I will complete your work for a reasonable fee. Best Regards, Jay

₹5000 INR in 7 days
(6 reviews)
4.6
shahparam

Hi, I have good experience with Spark-based data processing & ETL applications on local, Hadoop & AWS environments. I prefer to use Scala but am happy to support your requirements.

₹6000 INR in 2 days
(4 reviews)
2.1
roshanr1993

Hi, I have 6 years of experience in the skill set you mentioned and work as a senior big data engineer. Let me know about the requirements and we can discuss more on a call. Thanks

₹12500 INR in 7 days
(1 review)
0.8