DLO

New Members
  • Posts: 2

  1. I am trying to use JSONReader (along with PHP and XPath) to parse a very large JSON file and then display search results. A stream parser (such as JSONReader) is recommended over json_decode when parsing large files. The simple code below does not display any results (the echo statements print nothing). Any advice is greatly appreciated. (A streaming alternative is sketched after these two posts.)

     $reader = new JSONReader();
     $reader->open('products.json');

     $dom = new DOMDocument;
     $xpath = new DOMXpath($dom);

     while ($reader->read() && $reader->name !== 'product') {
         continue;
     }

     while ($reader->name === 'product') {
         $node  = $dom->importNode($reader->expand(), TRUE);
         $name  = $xpath->evaluate('string(name)', $node);
         $price = $xpath->evaluate('string(price)', $node);
         echo "Name: " . $name . ". ";
         echo "Price: " . $price . ". ";
         $reader->next('product');
     }

     Here is a snippet of the JSON file:

     {
         "products": {
             "product": [
                 { "name": "Dell 409",   "price": 499.99 },
                 { "name": "HP Lap top", "price": 599.99 },
                 { "name": "Compaq 11",  "price": 299.99 }
             ]
         }
     }
  2. I am working on a project where keywords (submitted from a website) will query a large XML document for matching criteria (for shopping products), then retrieve and return all the relevant search results (much like Google Shopping). There may be hundreds of search results returned, with 20 search results per page. Using PHP for the API call, would XPath or SAX be better with regard to page-loading speed and memory efficiency? (A pull-parsing sketch follows below.)
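
On the first post: XPath and DOM calls such as expand() and importNode() operate on XML nodes, so they cannot be pointed at a JSON stream, which is why the loop never finds a 'product' element. Below is a minimal sketch of a streaming alternative, assuming the third-party halaxa/json-machine Composer package (not the PECL JSONReader) and the products.json structure shown above; the package name, install path, and property access are assumptions, not part of the original post.

    <?php
    // Sketch only: assumes the halaxa/json-machine package is installed
    // via Composer (composer require halaxa/json-machine).
    use JsonMachine\Items;

    require 'vendor/autoload.php';

    // Stream only the array under /products/product instead of loading the
    // whole file with json_decode(); memory use stays roughly constant.
    $products = Items::fromFile('products.json', ['pointer' => '/products/product']);

    foreach ($products as $product) {
        // Each item is decoded one at a time (a stdClass object by default).
        echo "Name: " . $product->name . ". ";
        echo "Price: " . $product->price . ". ";
    }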
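
On the second post: PHP's built-in XMLReader is a pull parser with roughly the same memory profile as SAX, and expand() hands one record at a time to DOMXPath, so XPath convenience is available without loading the whole document. Below is a minimal sketch under assumptions not stated in the post: a products.xml file with <product><name>...</name><price>...</price></product> records, and a hypothetical $keyword search term. Only matching records are kept in memory, then paged 20 per page.

    <?php
    $keyword = 'laptop';   // hypothetical search term submitted from the website
    $perPage = 20;
    $page    = 1;

    $reader = new XMLReader();
    $reader->open('products.xml');

    $dom   = new DOMDocument();
    $xpath = new DOMXPath($dom);

    // Skip forward to the first <product> element.
    while ($reader->read() && $reader->name !== 'product') {
        continue;
    }

    $matches = [];
    while ($reader->name === 'product') {
        // Expand only the current record into a DOM node for XPath.
        $node = $dom->importNode($reader->expand(), true);
        $name = $xpath->evaluate('string(name)', $node);

        if (stripos($name, $keyword) !== false) {
            $matches[] = [
                'name'  => $name,
                'price' => $xpath->evaluate('string(price)', $node),
            ];
        }

        // Jump to the next <product> sibling without re-reading its children.
        $reader->next('product');
    }
    $reader->close();

    // Return one page of results (20 per page).
    echo json_encode(array_slice($matches, ($page - 1) * $perPage, $perPage));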