Get site contents from the URL
I have a website deployed on a server within my organization whose URL is
http://mysubsite.mysite.com/Folder1/Folder2/Default.aspx
Is there any way I can read all the pages that exist in the folder, like the Folder1 contents or the Folder2 contents?
This can be done with wget (http://www.gnu.org/software/wget/).
Example: wget -nc --page-requisites --domains mysubsite.mysite.com --no-parent mysubsite.mysite.com/Folder1/Folder2/
-nc is no-clobber: any existing files will not be overwritten.
--page-requisites will grab anything necessary for proper page viewing (CSS files, etc.).
--domains mysubsite.mysite.com will keep wget from following any links outside of mysubsite.mysite.com.
--no-parent: don't follow links outside the directory /Folder1/Folder2/.
Edit: Just re-read your question; I don't think you'd want to use the recursive flag. Stick with --no-parent and that will let you get everything in the /Folder1/Folder2/ directory.
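For reference, here is what the command might look like with that edit applied. This is just a sketch using the flags described above and the URL from the question; adjust the host and paths to your own site:

```sh
# Download everything reachable under /Folder1/Folder2/, along with the
# files needed to render those pages (CSS, images, etc.), without
# overwriting anything already saved locally.
wget -nc \
     --page-requisites \
     --domains mysubsite.mysite.com \
     --no-parent \
     http://mysubsite.mysite.com/Folder1/Folder2/
```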
HTH
No, you can't.
Sure, you could use wget as suggested by others, but that won't do what you're after. It will only read links and gather the files it can find.
In short, the client has no idea what files are out there on the server, and not all servers are going to tell it.
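As a rough illustration of that point (assuming the URL from the question and a machine with curl and grep available), this only surfaces whatever Default.aspx happens to link to; files that no page links to never show up:

```sh
# Fetch the page and list the href targets found in its HTML.
# A crawler like wget can only discover URLs this way; the server
# never hands over a full listing of the folder's contents.
curl -s http://mysubsite.mysite.com/Folder1/Folder2/Default.aspx \
  | grep -oE 'href="[^"]*"'
```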