Originally Posted by EditFast
I have one page of content (a links page describing all my sites) that appears on all my sites which are all on the same server. Right now, if I make a change or add a new site, I need to edit each page on every site. I would like to be able to edit one page and have the changes appear on all sites. I assume I need some sort of script to do this but I don't know what language or even if I do need a script. Is this even possible? Any ideas of where to start... or where to start looking?
So this presumably means you have one server, several domains (maybe), and a number of pages on that server acting as home pages?
There are many ways of solving this - eg the way I do it is by altering the default index of my server (by going into the httpd.conf file, which only trained specialists are supposed to touch, and bold amateurs) and pointing it at a script (which happens to be in Perl, but it could be any language, I'm sure).
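For what it's worth, that httpd.conf change is roughly this - the script name and path here are invented examples, not my real setup:

```apache
# Try the dispatcher script first, fall back to a plain index page.
DirectoryIndex /cgi-bin/dispatch.cgi index.html
```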
The script then asks the server which domain the remote user requested the page from - the script has a different page to call up for each domain, looks up the right one, and serves it.
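As a rough sketch of that lookup (in shell rather than Perl, and with made-up domain names and paths), it's just a mapping from the requested domain to a file - a real CGI would read `$HTTP_HOST`, which Apache sets from the domain the remote user asked for:

```shell
#!/bin/sh
# Sketch of a per-domain dispatcher; every domain and path here is invented.
pick_page() {
  case "$1" in
    example-one.com|www.example-one.com) echo "/var/www/pages/one.html" ;;
    example-two.com|www.example-two.com) echo "/var/www/pages/two.html" ;;
    *)                                   echo "/var/www/pages/default.html" ;;
  esac
}

# A CGI wrapper would then do something like:
#   echo "Content-type: text/html"; echo ""
#   cat "$(pick_page "$HTTP_HOST")"
pick_page "example-one.com"
```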
But you appear to have quite a different setup: lots of duplicate pages for many domains, maintained in one place, whilst mine is lots of different pages all called up from a single page.
Surely if your sites share a front page, it would be more efficient just to point all their domains at one front page?
Or is SOME of the page different each time?
SSI is the second best way to do the job - the reason there is a better way is simply that SSI is a bit like leaving a mobile phone plugged in all the time: a slightly unnecessary use of resources. When you start switching things into executable status on the back of a server, you see that the more there is that is executable or executed, the more stress the server has to cope with every day... it'll go bald faster.
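For reference, the SSI approach being compared here is the one where each front page is a .shtml file containing a directive like this (the include path is invented), which the server has to process on every single request:

```html
<!--#include virtual="/includes/links.html" -->
```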
The best solution is to set up a little database with the "changing" content, plus one or two shell scripts called by the crontab each day or hour or whatever. Every time the shell script runs (a very minor overhead of resources if it is properly written***), it can test for changed or new content in the database, and if the conditions are met, it reprints the entire HTML files for your front pages - all as new, completely flat files, with no SSI, no CGI, no nuffink.
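A minimal sketch of that rebuild step, assuming the "database" is just a tab-separated text file of url/title pairs - every filename here is made up for illustration:

```shell
#!/bin/sh
# Reprint a completely flat links page from a tiny "database" file.
# Assumed format: one "url<TAB>title" pair per line.
rebuild() {  # usage: rebuild <db-file> <output-html-file>
  db=$1; out=$2
  tab=$(printf '\t')
  {
    echo "<html><body><ul>"
    while IFS="$tab" read -r url title; do
      echo "  <li><a href=\"$url\">$title</a></li>"
    done < "$db"
    echo "</ul></body></html>"
  } > "$out"
}

# Demo with a throwaway file; the cron job would only call rebuild
# when the database is newer than the output (e.g. [ "$db" -nt "$out" ]).
printf 'http://example-one.com\tSite One\nhttp://example-two.com\tSite Two\n' > /tmp/links.db
rebuild /tmp/links.db /tmp/links.html
```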
It IS the best way, and the way I would (and do) use for such problems. It is the way that is kindest to the machine, in short - the programmer's way.
To do all this you need...
1. A Linux server
2. To start to play with crontab
3. To learn some basic perl (open file, write to file, read from file... that sort of EEEASY stuff)
4. Hours and hours to spend having so much fun, messing about in servers.
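On point 2, a crontab entry for this sort of thing is a single line (edit with `crontab -e`); this hypothetical one runs a rebuild script at the top of every hour:

```shell
# min hour day month weekday  command
0 * * * * /home/user/bin/rebuild_links.sh
```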
*** Even if the crontab runs this script once an hour, that's still only 24 executions per day - an index page with SSI on it will run its executable EVERY TIME THE PAGE IS ACCESSED... so if you get 1 million page views a day, that's 1 million executions. My method would take till I was VERY old before my machine had done the same amount of processing.
Just my 28 pesos