Cloud setup for Python 2.7 and Python 3.7 scrapers

Closed · Pay on delivery

I need to have our scrapers deployed to the cloud.

I am scraping data from a few sites and storing it in a MySQL database. I already have all the scrapers and the database.

I have the following 6 scripts:

1. Scraper for ksl.com/cars - BeautifulSoup - Python 3.7

2. Scraper for [login to view URL] .com - BeautifulSoup - Python 3.7

3. Scraper for craigslist - Scrapy - Python 2.7

4. Scraper for autotrader - Scrapy - Python 2.7

5. Script for eBay Motors API - Python 3.7

6. [login to view URL] - Python 2.7 - calls stored procedures and sends their results via email using SendGrid
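For context, a minimal Python 3 sketch of what a script like number 6 could look like. Everything here is an assumption for illustration: the stored-procedure name `daily_report`, the `cars` schema, the email addresses, and the environment-variable names are placeholders, not the project's real identifiers.

```python
# Hypothetical sketch: call a stored procedure and email its results via SendGrid.
# All names (procedure, schema, addresses, env vars) are illustrative placeholders.
import os

def format_report(rows):
    """Render rows (a list of tuples) as a plain-text table for the email body."""
    if not rows:
        return "No results."
    return "\n".join(" | ".join(str(col) for col in row) for row in rows)

def send_report():
    import mysql.connector                   # pip install mysql-connector-python
    from sendgrid import SendGridAPIClient   # pip install sendgrid
    from sendgrid.helpers.mail import Mail

    conn = mysql.connector.connect(
        host=os.environ["DB_HOST"],
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASS"],
        database="cars",                     # assumed schema name
    )
    try:
        cur = conn.cursor()
        cur.callproc("daily_report")         # assumed stored-procedure name
        rows = [r for result in cur.stored_results() for r in result.fetchall()]
    finally:
        conn.close()

    message = Mail(
        from_email="reports@example.com",    # placeholder sender
        to_emails="owner@example.com",       # placeholder recipient
        subject="Daily scraper report",
        plain_text_content=format_report(rows),
    )
    SendGridAPIClient(os.environ["SENDGRID_API_KEY"]).send(message)
```

Keeping the formatting separate from the database and SendGrid calls makes the email body testable without credentials.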

(I have already tried setting things up on Google Cloud Platform using Google Compute Engine and Google Cloud SQL, but it isn't working very well.)

Our database is MySQL 5.7 (2nd Generation, InnoDB).
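Since the scrapers revisit the same sites repeatedly, a sketch of how a scraper might write into MySQL 5.7 without duplicating rows, using `ON DUPLICATE KEY UPDATE`. The `listings` table, its columns, and the price-parsing helper are illustrative assumptions, not the project's actual schema.

```python
# Hypothetical upsert for re-scraped listings: refresh title/price instead of
# inserting a duplicate row. Table and column names are assumptions.

UPSERT_SQL = """
INSERT INTO listings (source, listing_id, title, price)
VALUES (%s, %s, %s, %s)
ON DUPLICATE KEY UPDATE title = VALUES(title), price = VALUES(price)
"""

def parse_price(text):
    """Turn a scraped price string like '$12,500' into an integer (or None)."""
    digits = "".join(ch for ch in text if ch.isdigit())
    return int(digits) if digits else None

def save_listing(cursor, source, listing_id, title, price_text):
    """Upsert one listing using an open DB-API cursor."""
    cursor.execute(UPSERT_SQL, (source, listing_id, title, parse_price(price_text)))
```

This assumes a unique key on `(source, listing_id)` so the `ON DUPLICATE KEY` clause fires for re-scraped listings.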

The setup needs to keep all of the scripts constantly running (each restarting automatically when it finishes).

The scripts need to be dormant (not running) from 12:00 AM to 6:00 AM (GMT-7).
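Those two requirements together can be sketched as a small supervisor loop: restart the scraper whenever it exits, but idle during the quiet window. The script name is a placeholder, and a fixed GMT-7 offset is assumed (no daylight-saving handling).

```python
# Supervisor sketch: restart a scraper on exit, stay dormant 12:00 AM - 6:00 AM GMT-7.
import subprocess
import time
from datetime import datetime, timedelta, timezone

GMT_MINUS_7 = timezone(timedelta(hours=-7))  # fixed offset, no DST handling

def in_quiet_window(now=None):
    """True between 00:00 and 06:00 in the GMT-7 zone."""
    now = now or datetime.now(GMT_MINUS_7)
    return 0 <= now.hour < 6

def run_forever(cmd):
    while True:
        if in_quiet_window():
            time.sleep(60)            # dormant: check again in a minute
            continue
        subprocess.call(cmd)          # blocks until the scraper exits
        time.sleep(5)                 # brief pause before restarting

# run_forever(["python3", "ksl_scraper.py"])  # placeholder script name
```

In a containerised setup, one such supervisor per container would satisfy "constantly running" without an external scheduler; alternatively, the cloud provider's scheduler could start and stop the containers around the window.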

I am open to using basically any cloud provider (Google, AWS, Azure, Digital Ocean, Heroku, etc.).

I am open to using docker containers.

The solution needs to support my running and testing it locally relatively easily.

The solution needs to support my pushing code changes to it relatively easily.
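As an illustration of what "easy local testing and easy pushing of changes" could look like with Docker, a hypothetical workflow. The image name, script name, environment variable, and branch are placeholders, not settled choices.

```shell
# Hypothetical local-test-and-deploy workflow (all names are placeholders).

# Build the image and run one scraper locally against a local database
docker build -t car-scrapers .
docker run --rm -e DB_HOST=127.0.0.1 car-scrapers python3 ksl_scraper.py

# Push code changes; the cloud side would rebuild/redeploy from the GitHub repo
git add -A && git commit -m "Update scraper" && git push origin master
```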

In your proposal, tell me:

- Exactly how you would architect the entire solution

- Why your solution makes sense / is the best

- How much experience you have with the technologies you'll use

- How many days it will take you to build it

If I award you the project, then I expect you to complete it in the timeframe you quote.

The setup will need to be well documented.

All of your code needs to be well commented.

Your code needs to have good error handling.

We will manage all code via GitHub.

At the end, you will need to walk me through the setup you build and show me how to use it.

Please feel free to ask any questions you have!

Python Web Scraping Software Architecture MySQL Cloud Computing

Project ID: #20782097

About the project

8 bids · Remote project · Last activity 4 years ago

8 freelancers are ready to do this job for an average of $57

joystick220

- Architecture: Compute: AWS Fargate to run your scrapers in a containerised environment, which gives you the option to run only during certain time windows. Database: AWS RDS. Documentation and resource orchestration: AWS Cloudf… More

$20 USD in 7 days
(34 reviews)
6.2
ferozstk

Hello, after reading your project details I believe I'm suitable for this project, as I'm an expert in this with more than 7 years of experience. Please feel free to contact me. I am looking forward to hearing from you. More

$30 USD in 1 day
(51 reviews)
5.9
sepehrbg

Hi sir, I am experienced in scraping and have done many similar jobs, which you can see in my reviews. I have already developed some code, so I can work quickly on your job. Would you please share the details?

$10 USD in 7 days
(34 reviews)
5.2
bluelagon

Hi, let's get it done. I have similar experience.

$30 USD in 2 days
(22 reviews)
4.5
frire

Hi, we can use AWS EC2 or any other VM provider like GCP. I am suggesting AWS EC2 because inbound costs are zero, so no extra charges will be incurred. I've deployed multiple apps on AWS, so I can say this can be done very… More

$30 USD in 7 days
(6 reviews)
4.4
MrMenezes

Would use GCloud + Kubernetes. Host one server for each script separately (ensuring isolation). Host a server to control and configure the schedule of each script (generating only one access point). Host a KubeDB with… More

$300 USD in 15 days
(0 reviews)
0.0