I have a job that I need to schedule on a daily basis to grab a webpage, parse the content, follow one of the links, parse its content, and then optionally send out an email alert.

From the comfort of my own home I'm free to use lynx -dump, which strips out all the HTML elements and shows just the text as you'd see it in a browser; that's helpful for part of my task.

However, I don't have a server at home, so I need to schedule this on my "shared Linux hosting" webserver box, where I only have a limited jailshell and no permission to call lynx.
I do have wwwget and curl, but from what I can see neither provides an equivalent command out of the box.
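The closest workaround I can think of is piping curl's output through a small filter of my own. Assuming Python is available on the host, something like the sketch below using the standard library's html.parser might do a crude version of what lynx -dump does (TextExtractor and strip_html are just names I made up; this ignores layout, links, and entities, so it's far from a real rendering):

```python
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> blocks."""

    def __init__(self):
        super().__init__()
        self.parts = []   # pieces of visible text, in document order
        self.skip = 0     # depth inside script/style elements

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip and data.strip():
            self.parts.append(data.strip())


def strip_html(html):
    """Return the visible text of an HTML document, one fragment per line."""
    parser = TextExtractor()
    parser.feed(html)
    return "\n".join(parser.parts)


if __name__ == "__main__":
    import sys
    # e.g. curl -s http://example.com | python3 striptags.py
    print(strip_html(sys.stdin.read()))
```

I'd then run something like curl -s http://example.com | python3 striptags.py from cron, but I don't know how robust that would be against real-world pages.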

Is there a clever alternative that I could try?