Learn how to transfer cookies from your web browser to your crawlers. Log into websites when web scraping or automating tasks using your existing logins.
To crawl websites that require a login, you can transfer cookies from your web browser directly into Apify actors such as Web Scraper (apify/web-scraper), Puppeteer Scraper (apify/puppeteer-scraper), and Instagram Scraper (jaroslavhejlek/instagram-scraper).
This is the quickest and simplest solution; however, other approaches may be more reliable. For example, you can fill in the login form directly in your code.
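As a sketch of the form-fill approach, here is what a programmatic login might look like with Puppeteer. The URL, selectors, and environment variables below are placeholders, not part of any Apify SDK; inspect the target site's actual login form to find the right values:

```javascript
const puppeteer = require('puppeteer');

(async () => {
    const browser = await puppeteer.launch();
    const page = await browser.newPage();

    // Placeholder URL and selectors - adjust to the real login page.
    await page.goto('https://example.com/login');
    await page.type('#username', process.env.SITE_USER);
    await page.type('#password', process.env.SITE_PASS);

    // Submit the form and wait for the post-login navigation to finish.
    await Promise.all([
        page.click('button[type="submit"]'),
        page.waitForNavigation(),
    ]);

    // The page is now authenticated; continue crawling from here.
    await browser.close();
})();
```

Because the login runs on every start, this approach keeps working even when cookies expire quickly.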
First, install a browser extension like EditThisCookie. After installation, go to the website you'd like to crawl and log in using your credentials.
Click the EditThisCookie button next to your URL and click Export. Cookies will be copied to your clipboard as a JSON array, which is compatible with the cookie format used by Puppeteer/Headless Chrome (the headless browser we use for crawling).
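For illustration, an exported array might look like the following (the domain, names, and values here are made up, and real exports typically include additional fields such as `expirationDate`):

```json
[
  {
    "domain": ".example.com",
    "name": "sessionid",
    "value": "abc123",
    "path": "/",
    "httpOnly": true,
    "secure": true
  }
]
```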
In Web Scraper's Input section, open the Proxy and browser configuration tab and paste the cookies into the Initial cookies field.
And that's it! When you run the scraper, it will start already logged in. Note that if the cookies are short-lived, this approach might stop working and you will need to implement the login in your code.
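If you are unsure whether your cookies are short-lived, you can inspect the `expirationDate` field (a Unix timestamp in seconds) in the exported JSON. Here is a minimal Node.js sketch, assuming the EditThisCookie export format; `soonToExpire` is a hypothetical helper, not part of any Apify SDK:

```javascript
// Return the cookies that are session-only or expire within `windowSecs`.
// Cookies without an expirationDate are session cookies and vanish when
// the browser closes, so they are flagged as short-lived too.
function soonToExpire(cookies, windowSecs = 24 * 60 * 60, nowSecs = Date.now() / 1000) {
    return cookies.filter((cookie) =>
        cookie.expirationDate === undefined
        || cookie.expirationDate < nowSecs + windowSecs
    );
}

// Example with made-up cookies: one long-lived, one expiring in an hour,
// and one session cookie with no expirationDate at all.
const now = 1700000000; // fixed "current time" in Unix seconds
const cookies = [
    { name: 'remember_me', value: 'x', expirationDate: now + 30 * 24 * 60 * 60 },
    { name: 'sessionid', value: 'y', expirationDate: now + 3600 },
    { name: 'csrftoken', value: 'z' },
];

const risky = soonToExpire(cookies, 24 * 60 * 60, now);
console.log(risky.map((c) => c.name)); // → [ 'sessionid', 'csrftoken' ]
```

If most of your authentication cookies show up as short-lived, the form-fill login described above is the more robust choice.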