Learn how to crawl and scrape a pre-determined list of web pages with Apify actors, using start URLs specified in a spreadsheet.
These actors start from a pre-defined list of URLs (start URLs) and can optionally follow links recursively to discover new pages.
If your URLs live in a Google Sheet, you don't have to add them to the actor manually or export them as a file and upload it to the scraper.
Simply add the /gviz/tq?tqx=out:csv query parameter to the base part of the Google Sheet URL, right after the long document identifier.
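As a minimal sketch, the transformation looks like this (the document identifier below is a made-up placeholder, and the helper assumes the usual .../edit#gid=0 suffix on the copied link):

```python
def to_csv_export_url(sheet_url: str) -> str:
    """Append /gviz/tq?tqx=out:csv right after the document identifier."""
    # Strip anything after the ID, e.g. a trailing /edit#gid=0
    base, _, _ = sheet_url.partition("/edit")
    return base.rstrip("/") + "/gviz/tq?tqx=out:csv"

# EXAMPLE_DOCUMENT_ID is a hypothetical placeholder for the long identifier.
sheet_url = "https://docs.google.com/spreadsheets/d/EXAMPLE_DOCUMENT_ID/edit#gid=0"
print(to_csv_export_url(sheet_url))
# → https://docs.google.com/spreadsheets/d/EXAMPLE_DOCUMENT_ID/gviz/tq?tqx=out:csv
```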
This gives you a URL that automatically exports the spreadsheet to CSV. Then, just click the Link remote text file button in the actor's input and paste the URL.
IMPORTANT: Make sure the document can be viewed by anyone with the link, otherwise the actor will not be able to access it.
And that's it: the actor will now download the content of the spreadsheet, with up-to-date URLs, every time it starts.
Keep the spreadsheet's structure simple, so the actor can easily find the URLs in it, and make sure the document contains only one sheet.
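To illustrate what "simple structure" means, here is a sketch of how URLs could be pulled out of the exported CSV. This is not the actor's actual implementation; the column name "url" and the sample content are assumptions for the example, and in practice the actor downloads the /gviz/tq?tqx=out:csv URL itself:

```python
import csv
import io

def extract_urls(csv_text: str) -> list[str]:
    """Collect values from an assumed 'url' column that look like links."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["url"] for row in reader if row.get("url", "").startswith("http")]

# Simulated content of the exported single-sheet spreadsheet.
csv_text = '"url"\n"https://example.com/page-1"\n"https://example.com/page-2"\n'
print(extract_urls(csv_text))
# → ['https://example.com/page-1', 'https://example.com/page-2']
```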