I am creating an application which browses a large amount of pictures. At this point, that portion of the project is done and it sorts, filters, and loads the correct pictures, and even splits them into separate pages for faster loading.
This works great, but it still takes over 8 seconds to load the 25 pictures per page. I have done some research and I have concluded that using asynchronous jQuery Ajax requests would be the best to load them all at the same time as fast as possible.
Here is my code for this so far:
var imageArray = <?php if (!empty($images)) {echo '["' . implode('", "', $images) . '"]';} else {echo "false";} ?>;
console.log(imageArray);
console.log(imageArray.length);
for (i = 0; i < imageArray.length; i++) {
    $.ajax({
        type: 'GET',
        url: imageArray[i],
        dataType: 'image/jpg',
        async: true,
        success: function (data) {
            $("#" + i).attr("src", data);
        }
    });
}
The problem with this code is that it loads nothing but an empty white square with a grey border. When I modify the code and run it in the Chrome console, data
ends up being a string of jumbled characters, which I presume is the raw image data.
I have been searching for several days now, including on SO, and I have yet to find a solution which does what I need. On the contrary, I have only found solutions which simply put the URL into the image source using jQuery's attr(),
which isn't what I need.
If anyone can provide any kind of solution to fix this code, or even perhaps a different and more efficient method of getting all the images, I am open to anything.
imageArray: http://pastebin.com/03tvpNey
Regards, Emanuel
Answer
If you’re using Base64 image data (the string of jumbled characters you mentioned), you’ll need to get your img src
to be something like:
<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAUAAAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO9TXL0Y4OHwAAAABJRU5ErkJggg==" alt="Red dot" />
So your code should be:
var imageArray = <?php if (!empty($images)) {echo '["' . implode('", "', $images) . '"]';} else {echo "false";} ?>;
console.log(imageArray);
console.log(imageArray.length);
// Use let so each success callback captures its own i; with a plain
// var (or no declaration), every callback would fire with the loop's
// final value of i and target the wrong element.
for (let i = 0; i < imageArray.length; i++) {
    $.ajax({
        type: 'GET',
        url: imageArray[i],
        // Note: 'image/jpg' is not a valid jQuery dataType; 'text' is the
        // closest fit, and this only works if the server actually responds
        // with Base64-encoded text rather than raw binary.
        dataType: 'text',
        async: true,
        success: function (data) {
            $("#" + i).attr("src", 'data:image/png;base64,' + data);
        }
    });
}
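As a hedged alternative sketch: if you do want to fetch the images with JavaScript, it is simpler to request each one as a Blob and hand the browser an object URL, which avoids Base64 entirely. This assumes your img elements use letter-prefixed ids like i0, i1, … (an assumption; your markup uses bare numeric ids):

```javascript
// Point an <img> at a Blob via an object URL instead of a Base64 data URI.
function attachImage(imgElement, blob) {
    imgElement.src = URL.createObjectURL(blob);
}

// Browser-only wiring; imageArray is the array your PHP emits.
if (typeof document !== 'undefined') {
    imageArray.forEach(function (url, i) {
        fetch(url)
            .then(function (res) { return res.blob(); })
            .then(function (blob) {
                attachImage(document.getElementById('i' + i), blob);
            });
    });
}
```

Object URLs keep the image data as binary all the way through, so there is no encoding overhead and no question of whether the server sends Base64.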
However… I’d be very surprised if loading the images over AJAX and using the Base64 content is any quicker than using a normal <img src="path"/>
approach.
The point of AJAX is to fetch some data after the DOM has loaded. All modern browsers already fetch images asynchronously, so I doubt you'll see any performance gain whatsoever.
I'd suggest the more likely problem is that your 25 images, which I suppose are displayed as thumbnails, are still large, hi-res files. You should save a smaller 'thumbnail' version of each, and then fetch the hi-res image only when/if the user clicks on the thumbnail to view the full-size one.
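That thumbnail approach could be sketched like this. The "_thumb" filename suffix and the img.thumb / data-full markup are hypothetical conventions for illustration, not part of your existing setup:

```javascript
// Hypothetical helper: derive a thumbnail URL from a full-size URL,
// assuming a "name.jpg" -> "name_thumb.jpg" naming convention.
function thumbUrl(url) {
    return url.replace(/(\.[a-z0-9]+)$/i, '_thumb$1');
}

// Browser-only wiring: show thumbnails, swap in the hi-res file on click.
// Assumes markup like <img class="thumb" src="p_thumb.jpg" data-full="p.jpg">.
if (typeof document !== 'undefined') {
    document.querySelectorAll('img.thumb').forEach(function (img) {
        img.addEventListener('click', function () {
            img.src = img.dataset.full; // full-size URL stored in data-full
        });
    });
}
```

This way the page only ever downloads 25 small files up front, which is where the real 8-second cost is likely going.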
Note that an element ID can start with (or be) a number. This wasn't valid in HTML4, but is perfectly fine in HTML5. However, you might have trouble with CSS rules, so you'd be better off prefixing with a letter (e.g. i1, i2, i3…).
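A minimal sketch of why the prefix helps (the helper names here are hypothetical):

```javascript
// Letter-prefixed ids work directly as CSS selectors, whereas "#0" is
// invalid because a CSS selector cannot begin with an unescaped digit.
function imageId(i) {
    return 'i' + i;            // "i0", "i1", ...
}

function imageSelector(i) {
    return '#' + imageId(i);   // usable with $(...) or querySelector
}
```

With bare numeric ids, document.getElementById("25") works, but $("#25") and querySelector("#25") do not without escaping, so the prefix keeps everything consistent.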