Hi, I've created a script that gathers data from some files and then posts the data to a website. I've done the part where I extract all the info I want from the files, but I'm unsure how to get it onto my proposed website.
I'm not sure whether to use CGI for the website or something else. Can anyone suggest what is best, and
a quick and easy way of doing it?
Should this be done dynamically, or would it be sufficient to run the script every so often in the background (maybe as a cron job), and then have it copy static HTML pages that it generates to a public web directory?
If the former, then you'd need something like Apache with mod_perl to run the script on demand.
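If you go the static route, here's a minimal sketch of what the cron-run script could do. The file names, paths, and sample data below are all placeholders, assuming your existing script can dump its stats to a text file:

```shell
#!/bin/sh
# Sketch of the static-page approach: wrap gathered stats in an HTML
# page that cron can copy into a public web directory.
set -e

OUT=stats.html   # in practice something like /var/www/html/stats.html

# Stand-in for the data your real script would gather from its files.
printf 'files_processed 42\nerrors 0\n' > stats.txt

# Wrap the gathered data in a static HTML page.
{
  echo '<html><head><title>Daily stats</title></head><body><pre>'
  cat stats.txt
  echo '</pre></body></html>'
} > "$OUT"

cat "$OUT"
```

No web server modules are needed for this; the server just serves the generated file like any other static page.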
Yes, the script would run daily via a cron job. I'd then like to put the stats it gathers from the files on a website.
So is CGI OK for this, or should I be using something else?
CGI is for building dynamic sites. If the script generates static HTML pages, then there is no need for it.
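To tie this to the daily cron job mentioned above, the crontab entry could look something like the following. The script name and paths are assumptions; adjust them to your setup:

```shell
# Example crontab entry (edit with `crontab -e`): run the stats script
# daily at 02:00 and write its HTML output straight into the public
# web directory. Script name and paths are placeholders.
0 2 * * * /home/user/bin/gather_stats.pl > /var/www/html/stats.html
```

Redirecting straight into the web root means there's no separate copy step, though writing to a temp file and then moving it into place avoids serving a half-written page.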