duframe

New Members
  • Posts: 3
  • Joined
  • Last visited
  • Reputation: 0

  1. Thanks for the reply. I will have 25 results per page to break it up. I really want to return the results using fewer resources; I figured that so many queries running at the same time, against so many records in so many tables, would really slow things down. My goal is to plan for large amounts of traffic and have code and queries efficient enough to handle it fast. Do you think I am overthinking it? Perhaps my query could be optimized:

     SELECT
         table1.p_id, table1.r_name, table1.account_activated, table1.r_phone, table1.r_fax,
         table1.r_address_1, table1.r_address_2, table1.r_city, table1.r_postal, table1.r_state,
         table1.r_description, table1.r_genre_1, table1.r_genre_2, table1.r_genre_3, table1.r_genre_4,
         table1.r_genre_5, table1.r_date_added, table1.r_rate_store, table1.r_rate_store_total,
         table1.order_online, table1.online_delivery, table1.online_delivery_start, table1.online_delivery_end,
         table1.gift_certs, table1.RSS, table1.rewards_programs, table1.url, p_details.delivery,
         table1.r_lattitude, table1.r_longitude,
         ( 3959 * acos( cos( radians(40.745564) ) * cos( radians( r_lattitude ) )
             * cos( radians( r_longitude ) - radians(-73.977736) )
             + sin( radians(40.745564) ) * sin( radians( r_lattitude ) ) ) ) AS howfar
     FROM table1, p_details
     WHERE p_details.size = 1
       AND p_details.external = 1
       AND p_details.usb = 1
     HAVING howfar <= 3
     ORDER BY howfar

     Would a join make this run faster?
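     One way to make the relationship explicit is an INNER JOIN, sketched below with the column list trimmed for brevity. Note that the original FROM table1, p_details has no condition linking the two tables, so every table1 row is paired with every p_details row that passes the WHERE filters; the join column d.p_id = t.p_id used here is an assumption, since the original query does not show how the tables relate.

     -- Hedged sketch: explicit INNER JOIN version of the radius query.
     -- ASSUMPTION: p_details.p_id matches table1.p_id; adjust to the real join column.
     SELECT
         t.p_id,
         t.r_name,
         t.r_lattitude,
         t.r_longitude,
         d.delivery,
         ( 3959 * ACOS(
               COS(RADIANS(40.745564)) * COS(RADIANS(t.r_lattitude))
             * COS(RADIANS(t.r_longitude) - RADIANS(-73.977736))
             + SIN(RADIANS(40.745564)) * SIN(RADIANS(t.r_lattitude))
           ) ) AS howfar
     FROM table1 AS t
     INNER JOIN p_details AS d
             ON d.p_id = t.p_id        -- assumed relationship
     WHERE d.size = 1
       AND d.external = 1
       AND d.usb = 1
     HAVING howfar <= 3
     ORDER BY howfar;

     The JOIN keyword by itself will not change much; what matters more is that the join and filter columns (p_id, size, external, usb) are indexed, since the distance formula still has to run for every row that survives the join.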
  2. Hello All (using PHP, MySQL, jQuery, JavaScript). I have a database of about 1 million products. I will be using a jQuery/AJAX server call to retrieve records based on filters.

     Scenario: search for a hard drive within 5 miles of my house. When the results are displayed, show filters like hard drive size, external, internal, etc. Use the jQuery/AJAX call to return filtered results on the fly (the user doesn't leave the page). Filters are toggled. A user can view a product or navigate elsewhere and return to the search results page where they left off.

     As of now, it will query through a million records to find all products within a given radius. The typical number of products returned is about 1,000-10,000. I need some way to store these records for the user's session while they are browsing the site. I could then filter against a maximum of 10k records rather than 1 million. This would reduce the strain on the system and use fewer resources.

     Problems:
       1. A temporary table in MySQL is valid only while the connection to the server is open. I need it to last as long as the user is on the site.
       2. Creating a normal table to store each visitor's initial results would become too much to manage: there could be as many as 8,000 tables at a given time for 8,000 visitors, and that would slow performance overall.
       3. Saving the results as XML or JSON would work, but finding one file among up to 8,000 files in a directory increases lookup time, which is especially cumbersome when you are creating, updating, or deleting files. It would also require a cron job to delete expired files. Splitting the files across multiple directories with fewer files in each may also become too much to manage.

     What is the best way to approach this challenge for performance?
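     A minimal sketch of one alternative to per-visitor tables or files: a single shared cache table keyed by the PHP session id, so all visitors share one table and nothing per-user has to be created or dropped. The table name, column names, and the 'abc123sessionid' value below are illustrative assumptions, not part of the original schema.

     -- Hypothetical shared cache table for radius-search results.
     CREATE TABLE search_cache (
         session_id  CHAR(32)      NOT NULL,   -- visitor's PHP session id
         p_id        INT           NOT NULL,   -- product that matched the radius search
         howfar      DECIMAL(8,2)  NULL,       -- distance computed at search time
         created_at  TIMESTAMP     NOT NULL DEFAULT CURRENT_TIMESTAMP,
         PRIMARY KEY (session_id, p_id),
         KEY idx_created (created_at)          -- lets an expiry job find stale rows
     ) ENGINE=InnoDB;

     -- Run the expensive radius scan once per search and cache the matches...
     INSERT INTO search_cache (session_id, p_id, howfar)
     SELECT 'abc123sessionid', p_id,
            ( 3959 * ACOS(
                  COS(RADIANS(40.745564)) * COS(RADIANS(r_lattitude))
                * COS(RADIANS(r_longitude) - RADIANS(-73.977736))
                + SIN(RADIANS(40.745564)) * SIN(RADIANS(r_lattitude))
              ) ) AS howfar
     FROM table1
     HAVING howfar <= 5;

     -- ...then apply the toggled filters against at most ~10k cached rows.
     SELECT c.p_id, c.howfar
     FROM search_cache AS c
     JOIN p_details AS d ON d.p_id = c.p_id   -- assumed join column, as above
     WHERE c.session_id = 'abc123sessionid'
       AND d.external = 1
     ORDER BY c.howfar;

     A periodic DELETE FROM search_cache WHERE created_at < NOW() - INTERVAL 1 HOUR would replace the cron job that deletes expired files.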
  3. I have spent the entire day searching, trying, and pulling out my hair, and I cannot yet find a solution. I am rather new to JavaScript, so I'm sure some of my terminology is wrong here. I am using jQuery for a window box, and an XMLHttpRequest/ActiveXObject request to pull dynamic content from the database. Everything works perfectly in Firefox and other browsers except IE (using v8). An onClick on a link calls the script below and loads the response into a div called iteminfo. Only in IE, the iteminfo content does not load until the mouse is moved off of the screen (out of the display area of the browser), or until you open another application/window and come back to IE; at that point it loads the content into the div. I've tried setInterval and setTimeout as a delay to give IE time to download the dynamic content. I've also tried a loop (if readyState != 4, keep calling stateChanged until it is == 4), but that just turns into an endless loop. I look forward to your suggestions.

     var xmlHttp;

     // Kick off the AJAX request for the clicked item and show the target div.
     function showResult(rid, iid)
     {
         document.getElementById("iteminfo").style.display = "block";
         xmlHttp = GetXmlHttpObject();
         if (xmlHttp == null)
         {
             alert("Your browser does not support features used on this page. You may need to download the latest version of Javascript. You will not be able to view the pictures and descriptions for each item.");
             return;
         }
         var url = "pathtofilegoeshere.php";
         url = url + "?rid=" + rid;
         url = url + "&iid=" + iid;
         xmlHttp.open("GET", url, true);
         xmlHttp.onreadystatechange = stateChanged;
         xmlHttp.send(null);
     }

     // When the response is complete, drop it into the iteminfo div.
     function stateChanged()
     {
         if (xmlHttp.readyState == 4 || xmlHttp.readyState == "complete")
         {
             document.getElementById("iteminfo").innerHTML = xmlHttp.responseText;
         }
     }

     // Cross-browser XMLHttpRequest factory: native object first, ActiveX for older IE.
     function GetXmlHttpObject()
     {
         var xmlHttp = null;
         try
         {
             // Firefox, Opera 8.0+, Safari
             xmlHttp = new XMLHttpRequest();
         }
         catch (e)
         {
             // Internet Explorer
             try
             {
                 xmlHttp = new ActiveXObject("Msxml2.XMLHTTP");
             }
             catch (e)
             {
                 xmlHttp = new ActiveXObject("Microsoft.XMLHTTP");
             }
         }
         return xmlHttp;
     }