Importing large amounts of data from a CSV into your database through SQL Workbench or any other visual editor can be a tedious and time-consuming task, especially if you are connecting over SSH.
In this post I will describe how to host your Docker images in your own GitLab container registry. In addition, whenever a tag is pushed to the repository, a new image will be built and pushed to the registry.
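As a rough sketch of what that workflow can look like, the job below builds and pushes an image only on tag pipelines. It relies on GitLab CI's predefined variables (`CI_REGISTRY`, `CI_REGISTRY_IMAGE`, `CI_COMMIT_TAG`); the job name and Docker-in-Docker setup are one common approach, not the only one.

```yaml
# .gitlab-ci.yml — minimal sketch, assuming Docker-in-Docker is available on your runners.
build-tag:
  image: docker:latest
  services:
    - docker:dind
  script:
    # Authenticate against the project's built-in container registry.
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    # Tag the image with the git tag that triggered this pipeline.
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_TAG" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_TAG"
  only:
    - tags
```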
Scrapy is a Python framework that allows you to scrape and interact with content on the web. Until you get to advanced usage, however, it is meant to be run locally. Running Scrapy inside a Docker container lets you run your spiders in the same environment on every instance and makes it easy to import your scraped data into any project.
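A minimal sketch of such a container might look like the following Dockerfile. The project layout and the spider name `quotes` are placeholders for your own project; the base image choice is an assumption.

```dockerfile
# Minimal sketch: package a Scrapy project into an image.
FROM python:3.11-slim
WORKDIR /app
RUN pip install --no-cache-dir scrapy
# Copy the Scrapy project (scrapy.cfg, spiders, etc.) into the image.
COPY . .
# "quotes" is a hypothetical spider name — replace with your own.
CMD ["scrapy", "crawl", "quotes"]
```

Because the spider runs with the same dependencies everywhere, the same image can be used locally, in CI, or on a server.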