First off, does the website you are targeting support basic authentication? If so, you can use cURL's -u option (meaning the "real" cURL, not the PowerShell alias). In fact, you don't even need an external tool like cURL or wget: PowerShell's built-in Invoke-WebRequest cmdlet (aliased to "curl") does everything cURL does. To use basic auth with Invoke-WebRequest, though, you need to handle it differently; google "basic authentication with invoke-webrequest" for plenty of examples.

Dear Team, I am trying to set up a Rainmeter skin based on scraping the status page of my PV system:
- In a regular browser, I simply open this page: https://enlighten.enphaseenergy.com/web/5000000/today/graph/hours
- This opens a login screen; I enter my username/password, and the PV status page is then shown under the exact same URL
- My goal is to download the status page HTML (behind the login page) in the background with curl or wget, in order to parse it with Rainmeter
Now the big question: does anybody know how to pass the login credentials for this page with curl or wget? It looks like a JavaScript login procedure is used; just my email and my password are enough to display the data I want to scrape. How can this be simulated with a command-line HTML download tool? If not curl/wget, I'm open to any solution that avoids automating a full, heavyweight Chrome browser window.
Plan B: Can you maybe recommend another forum with "experts" for this curl/wget-specific question?
Note: Using "Remember me" at login is not what I'm after. It saves a cookie, so I don't need to enter my login credentials for a few weeks, but my multiple attempts to pass the extracted cookie to curl/wget have failed, so now I'm hoping to go the "simple way" with the above idea.
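As an aside on the cookie route mentioned in the note above, here is a rough sketch of how an extracted session cookie can be passed on the command line. The cookie name and value below are hypothetical placeholders; the real ones have to be copied from a logged-in browser's DevTools, and they expire with the session, which is why this approach is brittle:

```shell
# Hypothetical session cookie copied from the browser's DevTools
# (Application -> Cookies); name and value are placeholders:
cookie='_session_id=abc123'

# The downloads would then look like this (not run here, since they
# need a live, logged-in session on the server):
#   curl --cookie "$cookie" -o status.html \
#        'https://enlighten.enphaseenergy.com/web/5000000/today/graph/hours'
#   wget --header="Cookie: $cookie" -O status.html \
#        'https://enlighten.enphaseenergy.com/web/5000000/today/graph/hours'

# This is the exact request header either command sends for that cookie:
echo "Cookie: $cookie"
# -> Cookie: _session_id=abc123
```

If the value was copied with extra whitespace or URL-encoding from DevTools, the server will silently reject it, which is a common reason this fails.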
If your site doesn't support basic auth, you'll need to figure out what it does support.
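If the site does turn out to support basic auth, the curl side is a one-liner. A minimal sketch with the "real" cURL (the credentials below are made up):

```shell
# With basic auth, a single curl call would do it (not run here):
#   curl -u 'user@example.com:secret' -o status.html \
#        'https://enlighten.enphaseenergy.com/web/5000000/today/graph/hours'

# Under the hood, -u just adds a base64-encoded "user:password" header,
# which you can also build and pass by hand with -H:
auth=$(printf '%s' 'user@example.com:secret' | base64)
echo "Authorization: Basic $auth"
# -> Authorization: Basic dXNlckBleGFtcGxlLmNvbTpzZWNyZXQ=
```

wget has the equivalent --user and --password options.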
Posted by SilverAzide — Yesterday, 3:17 pm