Who Compresses their HTML?
Even Stack Overflow doesn't compress their HTML. Is it recommended to compress HTML? As far as I've seen, it looks like Google is the only one... (view the source). Why isn't this standard practice?
I think you are confusing source-code minification of HTML with GZIP compression. The latter is quite common (for example using mod_gzip on Apache, article here) and should be enough in most cases. It is entirely internal between the server and the browser; you can't see it in the source code.
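If you want to see that negotiation for yourself, here is a minimal Java sketch (the URL is just an example; this is not part of any answer's original code) that advertises gzip support the way a browser does and prints what the server sends back:

import java.net.HttpURLConnection;
import java.net.URL;

public class GzipCheck {
    public static void main(String[] args) throws Exception {
        HttpURLConnection connection =
            (HttpURLConnection) new URL("http://stackoverflow.com").openConnection();
        // Advertise gzip support, exactly as every modern browser does.
        connection.setRequestProperty("Accept-Encoding", "gzip");
        connection.connect();
        // A gzip-capable server answers with "Content-Encoding: gzip"; the
        // markup you see via "view source" has already been decompressed.
        System.out.println("Content-Encoding: " + connection.getHeaderField("Content-Encoding"));
        connection.disconnect();
    }
}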
Actual minification of the HTML is not really worth doing, except for sites where a saved byte can mean tens of thousands of dollars in traffic savings (like Google).
HTML minification apparently doesn't matter that much for Stack Overflow. I ran a little test on the HTML source of its front page.
Raw content length: 207454 bytes
Gzipped content length: 30915 bytes
Trimmed content length: 176354 bytes
Trimmed and gzipped content length: 29658 bytes
SO already uses GZIP compression, so trimming whitespace (actual HTML minification, or "HTML compression" as you call it) would save only around 1KB of bandwidth per response: 30915 − 29658 = 1257 bytes. For giants with over a million pageviews per day, that nevertheless adds up to more than 1GB of bandwidth per day (SO itself would save about that much). Google serves billions of pageviews per day, so there every saved byte amounts to gigabytes per day.
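A quick back-of-the-envelope check of that claim (the pageview count is an assumption for illustration, not a measured figure):

public class Savings {
    public static void main(String[] args) {
        long savedPerResponse = 30915 - 29658; // 1257 bytes saved by minifying
        long pageviewsPerDay = 1000000L;       // assumed round number
        double gbPerDay = savedPerResponse * pageviewsPerDay / (1024.0 * 1024 * 1024);
        System.out.println("Saved per day: " + gbPerDay + " GB"); // about 1.17 GB
    }
}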
FWIW, I used this simple quick'n'dirty Java application to test it:
package com.stackoverflow.q2424952;

import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.URL;
import java.util.zip.GZIPOutputStream;

public class Test {

    public static void main(String... args) throws IOException {
        InputStream input = new URL("http://stackoverflow.com").openStream();

        byte[] raw = raw(input);
        System.out.println("Raw content length: " + raw.length + " bytes");

        byte[] gzipped = gzip(new ByteArrayInputStream(raw));
        System.out.println("Gzipped content length: " + gzipped.length + " bytes");

        byte[] trimmed = trim(new ByteArrayInputStream(raw));
        System.out.println("Trimmed content length: " + trimmed.length + " bytes");

        byte[] trimmedAndGzipped = gzip(new ByteArrayInputStream(trimmed));
        System.out.println("Trimmed and gzipped content length: " + trimmedAndGzipped.length + " bytes");
    }

    // Reads the stream into a byte array, byte by byte.
    public static byte[] raw(InputStream input) throws IOException {
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        for (int data; (data = input.read()) != -1;) {
            output.write(data);
        }
        input.close();
        return output.toByteArray();
    }

    // Pipes the stream through GZIP and returns the compressed bytes.
    public static byte[] gzip(InputStream input) throws IOException {
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        GZIPOutputStream gzip = new GZIPOutputStream(output);
        for (int data; (data = input.read()) != -1;) {
            gzip.write(data);
        }
        input.close();
        gzip.close(); // Flushes the GZIP trailer into the underlying buffer.
        return output.toByteArray();
    }

    // Crude "minification": strips leading/trailing whitespace from every line
    // and drops the line separators. Fine for a size test, though it would
    // break real HTML such as <pre> blocks.
    public static byte[] trim(InputStream input) throws IOException {
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        BufferedReader reader = new BufferedReader(new InputStreamReader(input));
        for (String line; (line = reader.readLine()) != null;) {
            output.write(line.trim().getBytes());
        }
        reader.close();
        return output.toByteArray();
    }
}
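If you want to reproduce the numbers, it needs nothing beyond a plain JDK: save it as com/stackoverflow/q2424952/Test.java, then run javac com/stackoverflow/q2424952/Test.java followed by java com.stackoverflow.q2424952.Test from the directory containing com. The exact byte counts will of course differ from the ones above, since the front page changes constantly.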
Another good reason not to minify your code is learning. I love being able to look through people's source code to see how they solved problems, and likewise I keep my own source in full form so others can look at mine. I still have my code gzip-compressed before it's sent to the browser, but it arrives decompressed, in full form and fully readable.
I think few people do. Too much work, too little gain, especially as the HTTP payload can be gzip-compressed these days.
Gzip compression, which every modern web server and browser supports, makes HTML compression (minification) useless, or almost insignificant. So very few use it.