
URL processing

 
Sunder Ganapathy
Ranch Hand
Posts: 120
An HTTP page (say A) contains many links.
Is it possible to take the links in A one after another and process them as follows:
get the next link in A, say it is B.
If B itself contains many links, can we open each
of them and write it to a separate file?

A bird's-eye view would be very helpful.
 
Ben Souther
Sheriff
Posts: 13411
The program 'wget' will spider through all of the links on a page and download each one.
It's the program that mirror sites use to keep themselves up to date.
I've never seen a Java version of it.
You might want to take a look at http://jakarta.apache.org/commons/httpclient.
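For what it's worth, here is a minimal sketch of the two-level crawl described above, using only the standard library (java.net.URL and regex) rather than HttpClient. The class name LinkSpider and the regex-based href extraction are just for illustration; a real crawler would parse HTML properly, resolve relative URLs, respect robots.txt, and avoid revisiting pages.

```java
import java.io.*;
import java.net.URL;
import java.util.*;
import java.util.regex.*;

// Two-level link spider: read page A, follow each link B found on it,
// and write the links found on each B into its own output file.
public class LinkSpider {

    // Naive href matcher; good enough for a sketch, not for real HTML.
    private static final Pattern HREF =
        Pattern.compile("href\\s*=\\s*[\"']([^\"'#]+)[\"']",
                        Pattern.CASE_INSENSITIVE);

    // Pull all href targets out of an HTML string.
    public static List<String> extractLinks(String html) {
        List<String> links = new ArrayList<String>();
        Matcher m = HREF.matcher(html);
        while (m.find()) {
            links.add(m.group(1));
        }
        return links;
    }

    // Download the raw content of a URL as a String.
    public static String fetch(String url) throws IOException {
        StringBuilder sb = new StringBuilder();
        BufferedReader in = new BufferedReader(
            new InputStreamReader(new URL(url).openStream()));
        try {
            String line;
            while ((line = in.readLine()) != null) {
                sb.append(line).append('\n');
            }
        } finally {
            in.close();
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        String pageA = args.length > 0 ? args[0] : "http://example.com/";
        List<String> linksInA = extractLinks(fetch(pageA));
        int fileNo = 0;
        for (String b : linksInA) {
            if (!b.startsWith("http")) continue; // skip relative/mailto links
            try {
                List<String> linksInB = extractLinks(fetch(b));
                PrintWriter out = new PrintWriter(
                    new FileWriter("links-" + (fileNo++) + ".txt"));
                for (String link : linksInB) {
                    out.println(link);
                }
                out.close();
            } catch (IOException e) {
                // One bad link shouldn't kill the whole crawl.
                System.err.println("Skipping " + b + ": " + e.getMessage());
            }
        }
    }
}
```

Run it as `java LinkSpider http://some.page/` and it will write one `links-N.txt` file per link B found on the starting page.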
 