I have a 900+ row MySQL database, comprising 15 categories of site_name and URL details, dynamically accessed using a series of selects. The pages these URLs retrieve are loaded into the main contents frame of my home page. I am receiving a healthy number of hits each day but am unable to determine which links are most/least popular.
I have hit the problem of storing the contents of parent.contents.location.href for further use. I would like either to write it to a text file or to put it straight back into MySQL for analysis purposes.
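I'm guessing something like this sketch is the sort of thing I need for the text-file route (clicks.log and the 'url' query parameter are just placeholder names, and file_put_contents with FILE_APPEND needs PHP 5):

<?php
// Append one line per click: timestamp, then the URL, tab separated.
$url = isset($_GET['url']) ? $_GET['url'] : '';
file_put_contents('clicks.log', date('Y-m-d H:i:s') . "\t" . $url . "\n", FILE_APPEND);
?>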
The select code follows:
(The $a is passed into the function as the value of site_code, which is filtered on but not retrieved.)
$result_recordset=$db->Execute("select distinct site_name, site_link
from sites -- 'sites' is a stand-in for the real table name
where site_code = '$a'
and site_name is not null
and active = 'Y'
order by site_name");
while (!$result_recordset->EOF) { // every row becomes an <option>
echo("<option value='".$result_recordset->fields['site_link']."'>".$result_recordset->fields['site_name']."</option>");
$result_recordset->MoveNext();
}
The use of a separate SUBMIT button to perform a GET or POST would not suit the architecture as it now exists.
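An onchange handler on the select itself avoids the SUBMIT button altogether; a rough sketch, where go.php is a placeholder name for the kind of logging page discussed below and 'contents' is the frame name from parent.contents above:

<select name="site_list"
        onchange="if (this.value) parent.contents.location.href = 'go.php?url=' + encodeURIComponent(this.value);">
<option value="">-- choose a site --</option>
<!-- the PHP loop above echoes the real option rows here -->
</select>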
I didn't know whether an 'intermediate page' would do the trick. I guess you mean to use it as a 'black box' and 'dump' the data there en route to the 'proper page'; that could well solve things. Thanks for that! I'll give it a whirl.
Well... I don't know about 'black box'; but 'dumping' the data sounds about right.
The important thing is that everything except going to the site URL will be behind the scenes; and unless there is a TON of data, or a lousy internet connection, or something bottlenecking the data, it should process invisibly to the user - from their point of view, it just goes to that site, nothing more.
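If it helps, a bare-bones version of that intermediate page might look like this; click_log is a hypothetical table, go.php a placeholder name, and $db the same ADOdb-style connection used in the select code above:

<?php
// go.php - record the click, then pass the visitor straight on to the site.
include('db_connect.php'); // placeholder for however $db is normally set up
$url = isset($_GET['url']) ? $_GET['url'] : '';
if ($url != '') {
    // click_log is assumed: clicked_at datetime, site_link varchar(255)
    $db->Execute("insert into click_log (clicked_at, site_link)
                  values (now(), " . $db->qstr($url) . ")");
    header('Location: ' . $url); // must run before any output is sent
    exit;
}
?>

From the user's side nothing changes: the browser asks for go.php, the row is written, and the Location header sends them on to the real site.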
Thanks for the suggestions, guys. Sorry about the late reply. I think my ISP has an issue with superglobals like $_GET since upgrading to PHP 5, so I'm trying either to get them turned back on or to find a workaround.
location.href="page2.php?Result=" + Result;
to the external php.
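Assuming the $_GET issue gets sorted (apparently superglobals can't actually be switched off in PHP 5; register_globals is usually the setting hosts disable), my sketch of page2.php itself can stay tiny:

<?php
// page2.php - server side of the hand-off above.
$result = isset($_GET['Result']) ? $_GET['Result'] : '';
// $result now holds the value built in JavaScript; from here it can be
// written to the log table or flat file sketched earlier in the thread.
?>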
This is new territory for me... trying to 'marry' client-side and server-side seems like a minefield!