
Doxygen: HTML too large

I have a very large class that is heavily documented. I use Doxygen to produce the HTML documentation, but for this class the generated HTML file is huge, about 12 MB, which is too much for my taste.

This happens because Doxygen lists all the documentation of the class in a single file, but in this case it would be desirable to split the documentation of the class functions into separate pages, in order to keep each page a reasonable size.

What I'm asking: is there any solution to this?

Maybe there is a special option or trick for this? Or maybe I could disable the "normal" documentation for this class and copy the function documentation into a better-organized page? In that case I would have to @copydoc those functions, right?
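For reference, the manual-page idea would look something like the following Doxygen block (BigClass and its members are hypothetical names, just to illustrate):

/** @page bigclass_io BigClass I/O functions
 *
 *  @section sec_read BigClass::read
 *  @copydoc BigClass::read
 *
 *  @section sec_write BigClass::write
 *  @copydoc BigClass::write
 */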

What is your advice?


It is possible!

There is a configuration key: 'SEPARATE_MEMBER_PAGES = YES'. This creates a separate page for each class member.

The next problem is the sheer number of files that Doxygen then generates (a filesystem nightmare), but this is solvable by setting 'CREATE_SUBDIRS = YES': at least this spreads the files across subdirectories, which makes them manageable.
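In the Doxyfile, the two settings look like this (both are standard Doxygen options):

SEPARATE_MEMBER_PAGES = YES
CREATE_SUBDIRS        = YES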


After a few days...


No. The problem still exists... in fact, it is worse than ever: each separate member page lists all class members on the left. 4K member pages at about 1 MB each make... 4 GB? OMG.

So, I decided to remove that (very expensive) member-list table from the resulting HTML. Fortunately this section is uniform across all generated files. I wrote a utility program to achieve this. Here is the most useful code snippet (using HtmlAgilityPack):

// Locate the per-member navigation table that Doxygen emits on every page.
HtmlAgilityPack.HtmlNode divNode = htmlDoc.DocumentNode.SelectSingleNode(
    "//body/div[@class='contents']/table/tr/td/div[@class='navtab']");

if (divNode != null) {
    // Drop the whole table (false = do not keep its children).
    divNode.ParentNode.RemoveChild(divNode, false);

    Console.WriteLine("Queued {0}", htmlPath);

    // Hand the modified document over to the writer threads.
    lock (sReducedHtmlPathsLock) {
        sReducedHtmlPaths.Add(htmlPath, htmlDoc);
        sReducedHtmlPathsSemaphore.Release();
    }
}

Of course, I built a multithreaded program that loads a bunch of HTML files on separate threads and writes them on separate threads too (otherwise it takes too long).
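A minimal sketch of such a driver follows. It is not the original tool: it uses Parallel.ForEach instead of hand-rolled reader/writer threads and the semaphore queue shown above, overwrites the pages in place, and assumes the Doxygen HTML output directory is passed as the first argument:

using System;
using System.IO;
using System.Threading.Tasks;
using HtmlAgilityPack;

class NavtabStripper
{
    static void Main(string[] args)
    {
        string htmlDir = args[0]; // the Doxygen HTML output directory

        // CREATE_SUBDIRS scatters files, so search all subdirectories.
        var pages = Directory.EnumerateFiles(
            htmlDir, "*.html", SearchOption.AllDirectories);

        // Load, strip and save each page; Parallel.ForEach spreads the
        // work over the thread pool.
        Parallel.ForEach(pages, htmlPath =>
        {
            var htmlDoc = new HtmlDocument();
            htmlDoc.Load(htmlPath);

            // Same XPath as in the snippet above.
            var divNode = htmlDoc.DocumentNode.SelectSingleNode(
                "//body/div[@class='contents']/table/tr/td/div[@class='navtab']");

            if (divNode != null)
            {
                divNode.ParentNode.RemoveChild(divNode, false);
                htmlDoc.Save(htmlPath); // overwrite in place
                Console.WriteLine("Reduced {0}", htmlPath);
            }
        });
    }
}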

The result: from 4 GB down to 60 MB!

Are you curious about the result? See the pages linked from this documentation page.
