
Why use a whitelist for HTML sanitizing?

I've often wondered -- why use a whitelist as opposed to a blacklist when sanitizing HTML input?

How many sneaky HTML tricks are there that open XSS vulnerabilities? Obviously script tags and frames would not be allowed, and a whitelist would be applied to the attributes within HTML elements, but why disallow almost everything else?


If you leave something off a whitelist, then you just break something that wasn't important enough for you to think about in the first place.

If you leave something off a blacklist, then you've opened a big security hole.

If browsers add new features, then your blacklist becomes out of date.
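To make the difference concrete, here is a minimal sketch of a whitelist-based sanitizer in Python, using only the standard library. The ALLOWED_TAGS set is a hypothetical example policy, not a recommendation, and for simplicity all attributes are dropped.

    # Minimal whitelist sanitizer sketch (hypothetical ALLOWED_TAGS policy).
    # Tags not on the whitelist are removed; all attributes are dropped.
    from html.parser import HTMLParser
    from html import escape

    ALLOWED_TAGS = {"p", "b", "i", "em", "strong", "ul", "ol", "li"}

    class WhitelistSanitizer(HTMLParser):
        def __init__(self):
            super().__init__(convert_charrefs=True)
            self.out = []

        def handle_starttag(self, tag, attrs):
            if tag in ALLOWED_TAGS:          # anything unknown is silently dropped
                self.out.append(f"<{tag}>")

        def handle_endtag(self, tag):
            if tag in ALLOWED_TAGS:
                self.out.append(f"</{tag}>")

        def handle_data(self, data):
            self.out.append(escape(data))    # text content is always escaped

    def sanitize(html_input):
        parser = WhitelistSanitizer()
        parser.feed(html_input)
        parser.close()
        return "".join(parser.out)

    print(sanitize('<p onclick="alert(1)">hi <script>alert(2)</script></p>'))
    # -> <p>hi alert(2)</p>

Note how a tag or attribute that was never considered simply disappears, which is the "break something unimportant" failure mode rather than the "open a security hole" one.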


I just read something about that yesterday; it's in the feedparser manual.

A snippet:

The more I investigate, the more cases I find where Internet Explorer for Windows will treat seemingly innocuous markup as code and blithely execute it. This is why Universal Feed Parser uses a whitelist and not a blacklist. I am reasonably confident that none of the elements or attributes on the whitelist are security risks. I am not at all confident about elements or attributes that I have not explicitly investigated. And I have no confidence at all in my ability to detect strings within attribute values that Internet Explorer for Windows will treat as executable code. I will not attempt to preserve “just the good styles”. All styles are stripped.

There is a serious risk if you only blacklist some elements and forget an important one. When you whitelist only the tags you know are safe, the risk of letting in something that can be abused is much smaller.


Even though script tags and frame tags are not allowed, you can still inject a tag like this

<test onmouseover=alert(/XSS/)>mouse over this</test>

and many browsers will execute it.
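This is why attribute values have to be whitelisted per element as well. A minimal sketch in Python, assuming a hypothetical ALLOWED_ATTRS policy, showing how event-handler attributes such as the onmouseover above get dropped:

    # Per-attribute whitelisting sketch (hypothetical ALLOWED_ATTRS policy).
    ALLOWED_ATTRS = {"a": {"href", "title"}, "img": {"src", "alt"}}

    def filter_attrs(tag, attrs):
        """Keep only the attributes explicitly allowed for this tag."""
        allowed = ALLOWED_ATTRS.get(tag, set())
        return [(name, value) for name, value in attrs if name.lower() in allowed]

    print(filter_attrs("test", [("onmouseover", "alert(/XSS/)")]))
    # -> []  (the event handler is gone)
    print(filter_attrs("a", [("href", "https://example.com"), ("onclick", "alert(1)")]))
    # -> [('href', 'https://example.com')]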


Because then you can be sure that you don't miss anything. By explicitly allowing only certain tags, you obviously have much more control over what is allowed.

Whitelists are used in most security-related areas. Think of firewalls: the first rule blocks all (incoming) traffic, and then only the ports that are supposed to be open are opened. This is far more secure.


Because other tags can break the layout of a page. Imagine what would happen if someone injected a <style> tag. The <object> tag is also dangerous.


I prefer to have both, which I call the "Black List with Relaxed White List" approach:

  1. Create a relaxed "White List" of tags & attributes.
  2. Create a "Black List for the White List": any tag/attribute in the black list must also exist in the white list you created, or an error is raised.

This black list acts as an on-off switch for tags/attributes in the relaxed white list.

This "Black List with Relaxed White List" approach makes it much easier to configure the sanitizing filter.

As an example, the White List can contain all HTML5 tags and attributes, while the Black List contains the tags and attributes to be excluded.
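A minimal sketch of that idea in Python, assuming hypothetical RELAXED_WHITELIST and BLACKLIST example policies: the black list may only switch off entries that already exist in the white list, otherwise the configuration is rejected.

    # "Black List with Relaxed White List" sketch (hypothetical example policies).
    RELAXED_WHITELIST = {"p", "a", "img", "video", "table", "style", "iframe"}
    BLACKLIST = {"style", "iframe"}

    def effective_whitelist(whitelist, blacklist):
        # Every blacklisted entry must already be on the whitelist; otherwise the
        # configuration is inconsistent and we fail loudly instead of guessing.
        unknown = blacklist - whitelist
        if unknown:
            raise ValueError(f"blacklisted entries not in whitelist: {sorted(unknown)}")
        return whitelist - blacklist    # the black list just switches entries off

    print(sorted(effective_whitelist(RELAXED_WHITELIST, BLACKLIST)))
    # -> ['a', 'img', 'p', 'table', 'video']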


The more you allow, the more tricks are left for clever hackers to inject nasty code into your webpage. That's why you want to allow as little as possible.

See Ruben van Vreeland's lecture How We Hacked LinkedIn & What Happened Next for a good introduction to XSS vulnerabilities and why you want your whitelist to be as strict as possible!
