
Getting an image with PHP

Is it bad practice to retrieve images this way? I have a page that calls this script about 100 times (there are 100 images). Can I cause server overload, too many HTTP requests, or something like that? I'm having problems with the server and I don't know if this is causing them :(

// SET THE CONTENT TYPE HEADER
header('Content-type: image/jpeg');

// GET THE IMAGE TO DISPLAY
$image = imagecreatefromjpeg('../path/to/image/' . $_SESSION['ID'] . '/thumbnail/' . $_GET['image']);

// OUTPUT IMAGE AND FREE MEMORY
imagejpeg($image);
imagedestroy($image);

I call the script from regular `<img>` tags. The reason I serve the images through PHP is that they are private to the user.

All help greatly appreciated!!


With this, you are:

  • Reading the content of a file
  • Decoding that content into an in-memory image
  • Re-encoding and re-sending that image


If you just want to send an image (that you have on disk) to your users, why not just use readfile(), like this:

header('Content-type: image/jpeg');
readfile('../path/to/image/' . $_SESSION['ID'] . '/thumbnail/' . $_GET['image']);

With that, you'll just:

  • Read the file
  • and send its content

Without decoding it into an image, which eliminates some useless computation in the process.


As a side note: you should not use $_GET['image'] like that in your path: you must make sure no malicious data is injected via that parameter!

Otherwise, anyone could potentially access any file on your server that the web server can read; they just have to specify a relative path in the image parameter...
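To illustrate the point, here is a minimal sketch of how that parameter could be sanitized before it ever touches the filesystem. The helper name and the whitelist pattern are mine, not from the original post; adjust the pattern to whatever file names you actually use:

```php
<?php
// Hypothetical helper: reduce the user-supplied value to a bare file name
// so "../" style input cannot escape the thumbnail directory.
function safe_image_name($name)
{
    // basename() strips any directory components,
    // e.g. "../../etc/passwd" becomes "passwd".
    $name = basename($name);

    // Only allow simple .jpg file names; reject anything else.
    if (!preg_match('/^[A-Za-z0-9_-]+\.jpg$/', $name)) {
        return null;
    }
    return $name;
}

// Usage inside the image script:
// $name = safe_image_name($_GET['image']);
// if ($name === null) { header('HTTP/1.1 404 Not Found'); exit; }
// readfile('../path/to/image/' . $_SESSION['ID'] . '/thumbnail/' . $name);
```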


Yes, it's very bad. You're decoding a .jpg into a memory-based bitmap (which is "huge" compared to the original binary .jpg). You then recompress the bitmap into a JPEG.

So you're:

  a) wasting a ton of memory
  b) wasting CPU time
  c) losing even more image quality, because JPEG is a lossy format

why not just do:

<?php
header('Content-type: image/jpeg');
readfile('/path/to/your/image.jpg');

instead?


To answer two particular questions you asked:

Can i cause server overload or too many http requests or something?

Yes, of course:
through both the numerous HTTP requests and the image processing.
You should reduce the number of images and implement some pagination to show images in smaller batches.
You may also implement some Conditional GET functionality to reduce bandwidth and load.
If things keep getting worse and you have resources to spare, consider installing a content delivery proxy; nginx with the X-Accel-Redirect header is a common example.
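The Conditional GET idea above can be sketched like this. The helper function and its name are my own illustration, not the answerer's code; only the Last-Modified / If-Modified-Since headers and the 304 status are standard HTTP:

```php
<?php
// Hypothetical helper: decide whether the client's cached copy is still
// fresh, given its If-Modified-Since header and the file's mtime.
function is_not_modified($ifModifiedSinceHeader, $fileMtime)
{
    if ($ifModifiedSinceHeader === null) {
        return false;  // client sent no conditional header
    }
    $clientTime = strtotime($ifModifiedSinceHeader);
    // Fresh if the client's copy is at least as new as the file on disk.
    return $clientTime !== false && $clientTime >= $fileMtime;
}

// Usage in the image script (assuming $path was validated earlier):
// $mtime = filemtime($path);
// header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');
// if (is_not_modified($_SERVER['HTTP_IF_MODIFIED_SINCE'] ?? null, $mtime)) {
//     header('HTTP/1.1 304 Not Modified');
//     exit;  // no body: the browser reuses its cached image
// }
// header('Content-type: image/jpeg');
// readfile($path);
```

With this in place, repeat visits to the page with 100 images cost the server 100 cheap 304 responses instead of 100 full image transfers.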

I have problems with the server and i dont know if this is causing it :(

You shouldn't shoot in the dark then. Profile your site first.
