PHP: Query Exceeding Max Time (How to handle this)
OK, I finally have my query written for my PHP file (which will write the query results to a txt file). However, when I was testing the file earlier, I was apparently exceeding the maximum execution time. I was using just one large query to do the work, but now I am wondering if it would be best to use multiple smaller queries. Would this cut down on the execution time?
If it would, then I have another question for all of you: how would I go about this? This is the bulk of my file:
// Query the database for data
$query = "SELECT
cards.card_id,
concat(cards.title, ' By Amy') AS cardTitle,
cards.meta_description,
cards.description,
'' as Weight,
'' as UPC,
cards.seo_keywords,
concat('http://www.amyadele.com/attachments//cards/',cards.card_id,'/',cards.card_image) AS ImageURL,
card_import.artist,
concat('ARTIST - ', card_import.artist) AS Brand,
min(card_lookup_values.card_price) AS LowPrice,
replace(lower(concat( 'http://www.amyadele.com/', pcat.seoname,'/',cat.seoname, '/', cards.seoname, '.htm' )),' ','+') AS link,
concat(pcat.name,'>',cat.name) as Merchant,
round(min(card_lookup_values.card_price), 2) AS newPrice,
min(cast(lookup_details.value as signed)) as quantity
FROM
cards
INNER JOIN card_cheapest on cards.card_id = card_cheapest.card_id
LEFT JOIN card_import on card_import.card_id = cards.card_id
INNER JOIN card_lookup_values on card_lookup_values.card_id = cards.card_id
INNER JOIN card_categories cc ON cards.card_id = cc.card_id AND cards.card_live = 'y' AND cards.active = 'y' AND cc.active = 'Y'
INNER JOIN categories cat ON cat.category_id = cc.category_id AND cat.active = 'Y'
INNER JOIN categories pcat ON cat.parent_category_id = pcat.category_id
INNER JOIN card_lookup_values as card_values ON cards.card_id = card_values.card_id
INNER JOIN lookup_details ON card_lookup_values.lookup_detail_id = lookup_details.lookup_detail_id
WHERE card_lookup_values.lookup_id = 7
GROUP BY
cards.card_id
ORDER BY
cards.card_id";
$result = mysql_query($query);
// Open file for writing
$myFile = "google.txt";
$fh = fopen($myFile, 'w') or die("can't open file");
// Loop through returned data and write (append) directly to file
fprintf($fh, "%-25s %-200 \n", "ID", "Name");
fprintf($fh, "\n");
while ($row = mysql_fetch_assoc($result)) {
fprintf($fh, "%-25s %-200s \n", $row['card_id'], $row['cardTitle']);
}
// Close out the file
fclose($fh);
echo "The file has been written sucessfully to googleproducts.txt. It will run again tomorrow at 12:00pm."
?>
As you can see, I am running $query into $result and then looping over that to fill $row. I guess my actual question (if it is even possible) is: how would I go about having $query and $query2, and feeding $query2 into the same while loop?
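For what it's worth, here is a rough sketch of what that two-query idea could look like; $query2 and its contents are hypothetical, and (as the answer below points out) splitting the work this way is not guaranteed to be any faster:
// Rough sketch only: $query2 is a hypothetical second, smaller SELECT.
$result  = mysql_query($query)  or die(mysql_error());
$result2 = mysql_query($query2) or die(mysql_error());
$fh = fopen("google.txt", 'w') or die("can't open file");
fprintf($fh, "%-25s %-200s \n", "ID", "Name");
// Write the rows from the first result set, then the second,
// reusing the same loop body for both.
foreach (array($result, $result2) as $res) {
    while ($row = mysql_fetch_assoc($res)) {
        fprintf($fh, "%-25s %-200s \n", $row['card_id'], $row['cardTitle']);
    }
}
fclose($fh);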
Multiple queries, if anything, will just slow things down: it means compiling and executing several queries, so you end up with more overhead. Use EXPLAIN on your query to see which indexes are (and aren't) being used, and add more indexing where required. And in some cases, things will just naturally be slow; you are, after all, doing an 8-way join across 7 tables.
Obvious places to check for index usage: each field in the WHERE and JOIN clauses.
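For example, you can run EXPLAIN from the same script and dump the plan; a minimal sketch, assuming $query still holds the SELECT above and the connection is already open:
// Prefix the existing SELECT with EXPLAIN and print, for each table
// in the plan, which index (if any) MySQL chose.
$plan = mysql_query("EXPLAIN " . $query) or die(mysql_error());
while ($row = mysql_fetch_assoc($plan)) {
    echo $row['table'] . " => " . ($row['key'] ? $row['key'] : "no index used") . "\n";
}
Any table showing no index (or a full scan under the type column) is a candidate for an index on the column it is joined or filtered on.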
You can change the max_execution_time directive in your php.ini,
or you can add this to the top of your script:
set_time_limit(0);
Zero means no time limit.
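If you would rather not remove the limit entirely, you can also raise it to a fixed number of seconds; a small sketch, where 300 is just an example value:
// Raise the limit to 300 seconds for this script only;
// the php.ini equivalent is: max_execution_time = 300
ini_set('max_execution_time', '300');
Either approach only needs to cover however long the export normally takes to run.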