
Preventing XSS attacks on user-submitted HTML content in PHP, the eBay way

I've read so many articles describing methods to prevent XSS attacks in user-submitted HTML content: escaping functions such as htmlspecialchars, regexes, whitelisting/blacklisting, and HTML filtering scripts such as HTML Purifier, HTMLawed, and so on.
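
For context, the standard toolbox those articles describe looks roughly like this (a minimal sketch; the HTML Purifier lines assume its stock autoloader and default configuration):

    <?php
    // Escaping: neutralizes ALL markup, so no rich HTML survives.
    $safe = htmlspecialchars($userHtml, ENT_QUOTES, 'UTF-8');

    // Filtering: HTML Purifier parses the HTML and keeps only a
    // whitelist of safe elements and attributes.
    require_once 'HTMLPurifier.auto.php';
    $purifier = new HTMLPurifier(HTMLPurifier_Config::createDefault());
    $clean = $purifier->purify($userHtml);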

Unfortunately, none of these explain how a site like eBay is able to allow such a vast array of potentially malicious HTML tags such as <link>, <script> and <object>, along with CSS styles and HTML attributes such as background: url(). It seems as if they allow users to submit pretty much anything into their item descriptions. I've seen some of the most elaborate HTML, JavaScript and Flash templates in item descriptions.

What is eBay doing differently? Is there another technique or layer that I am missing that allows them to block XSS attacks while still allowing pretty much anything in users' item descriptions?

Any ideas or insight into this would be greatly appreciated!


It's easy when you've got an army of programmers and a war chest full of money.

This isn't rocket science. They identify a vulnerability case and code around it, likely via regexes and JavaScript on the front end as well as heavy back-end validation to ensure the data isn't compromised prior to insertion. It's the same thing we should all be doing, except that for eBay it's far more mature than what most of us work on, and FAR bigger.

If it's anything like the bank I used to work for, they have a PAS team dedicated to finding minute bugs in production, opening tickets with engineers, and following the process through on a priority basis. Between developers, testers, quality management, and PAS, there's no reason a vulnerability should get out, but if one does happen to slip, it gets reacted to quickly.

You should consider taking a "progressive enhancement" approach to this challenge if you plan to go this route. Start by blocking JavaScript flat out. Then enhance to allow --some-- via a method you deem safe, and keep allowing only what's safe as you continue. Continue this process, allowing more and more while catching the edge cases as they come up in testing or production. Gradually, you'll migrate from allowing what IS allowed to blocking what ISN'T. While this should be a no-brainer, even cutting-edge companies miss the boat on the basic concept of lifecycle management and improvement.
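
A minimal sketch of that whitelist-first posture in PHP, using HTML Purifier's HTML.Allowed directive (the tag list is purely illustrative; widen it only as each addition is vetted):

    <?php
    require_once 'HTMLPurifier.auto.php';

    // Iteration one: no scripts, no styles, a tiny whitelist of tags.
    $config = HTMLPurifier_Config::createDefault();
    $config->set('HTML.Allowed', 'p,br,b,i,em,strong,a[href],img[src|alt]');

    // Later iterations widen the whitelist as cases prove safe,
    // instead of trying to enumerate everything dangerous up front.
    $purifier = new HTMLPurifier($config);
    $clean = $purifier->purify($userHtml);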

That being said, when trying to sanitize input it's best to combine both front- and back-end validation. The front end provides more intuitive, rapid feedback to clients, but as with any client-side language it can be overcome by savvy users. Back-end validation is your firewall, ensuring anything that slips past the front end is dealt with appropriately. Your DB is your lifeline, so protect it at all costs!
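
As a minimal illustration of that layering (the field names, table, and PDO credentials here are invented), the back end re-validates everything and parameterizes the write, no matter what the front-end JavaScript already checked:

    <?php
    // Back-end validation runs regardless of any client-side checks:
    // a savvy user can bypass the browser entirely with curl.
    $title = trim($_POST['title'] ?? '');
    if ($title === '' || mb_strlen($title) > 80) {
        http_response_code(422);
        exit('Invalid title');
    }

    // Sanitize the HTML before it ever reaches storage.
    require_once 'HTMLPurifier.auto.php';
    $purifier = new HTMLPurifier(HTMLPurifier_Config::createDefault());
    $body = $purifier->purify($_POST['body'] ?? '');

    // Parameterized insert: the DB never interprets user input as SQL.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $stmt = $pdo->prepare('INSERT INTO listings (title, body) VALUES (?, ?)');
    $stmt->execute([$title, $body]);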

Unless you have that army and that huge budget, trying to code for every edge case on something as broad as a CMS that allows near carte blanche input almost always ends up being a losing financial venture.


I spent some time with it, and I was able to alert(document.cookie) -- something that their blacklists clearly try to prevent -- by building on a years-old exploit listed here:

http://menno.b10m.net/blog/blosxom.cgi/web/ebay-xss.html

I haven't actually posted the listing, so I can only say for certain that this works in the listing preview mode.

If this is any indication, then the answer to your question is this: eBay doesn't really prevent XSS attacks. They have some blacklists in place, but they seem to be pretty trivial to evade.
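
To show the general class of problem (this is a generic toy filter, not eBay's actual code), a naive blacklist regex is defeated by trivially mutated payloads:

    <?php
    // A toy blacklist: strip <script>...</script> blocks and call it a day.
    function naive_filter(string $html): string {
        return preg_replace('/<script.*?<\/script>/i', '', $html);
    }

    $payloads = [
        // No <script> tag at all: the event handler fires on its own.
        '<img src=x onerror=alert(document.cookie)>',
        // The filter's own removal reassembles a working <script> tag.
        '<scr<script></script>ipt>alert(1)</script>',
        // Scheme-based vector: nothing here for the regex to match.
        '<a href="javascript:alert(1)">click</a>',
    ];

    foreach ($payloads as $p) {
        echo naive_filter($p), "\n"; // every one comes out still executable
    }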

Edit: I seem to have this working on my eBay "about me" page now, an actual live page. Long story short, eBay's attempts at filtering XSS appear to be pretty pathetic.


Although I can't say for sure, there are many techniques out there to help out. The major component for this to work is having an active and available security+development department.

Although they most certainly have a filter mechanism, code is probably also evaluated in simulators and rated according to certain criteria (such as suspicious content, obfuscation, etc.).
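
Purely speculative, but a crude version of that kind of rating pass might look like the following (every signal and weight here is invented for illustration):

    <?php
    // Invented heuristic: score listing HTML for "suspicious" patterns
    // and route high scorers to human review instead of publishing.
    function suspicion_score(string $html): int {
        $signals = [
            '/on\w+\s*=/i'            => 3, // inline event handlers
            '/javascript:/i'          => 3, // script-scheme URLs
            '/eval\s*\(/i'            => 4, // common obfuscation primitive
            '/String\.fromCharCode/i' => 4, // char-code obfuscation
            '/document\.cookie/i'     => 5, // cookie access
        ];
        $score = 0;
        foreach ($signals as $pattern => $weight) {
            $score += $weight * preg_match_all($pattern, $html);
        }
        return $score;
    }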

Then, there's also the Mechanical Turk aspect, which Amazon employs widely (basically, they employ "cheap labour" humans to judge something intelligently).

