I have to make a decision for our German community webpage (eXma). We will be relaunching it with a new system.
I'm trying to grab the HTML source code of a webpage that requires the user to log in with a username and password. The login is handled entirely on the back end, so I cannot just use "?username=xx&pw=xxx"
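A common approach is to POST the credentials once and keep the session cookie for the follow-up request. Below is a minimal C# sketch of that idea; the login URL, the protected-page URL, and the form field names are assumptions and have to be taken from the real login form.

    using System;
    using System.Collections.Generic;
    using System.Net;
    using System.Net.Http;
    using System.Threading.Tasks;

    class LoginScraper
    {
        static async Task Main()
        {
            // Keep cookies so the session survives across requests.
            var handler = new HttpClientHandler { CookieContainer = new CookieContainer() };
            using var client = new HttpClient(handler);

            // Hypothetical login endpoint and field names -- copy them from the site's <form>.
            var form = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["username"] = "xx",
                ["password"] = "xxx"
            });
            var login = await client.PostAsync("https://example.com/login", form);
            login.EnsureSuccessStatusCode();

            // The stored session cookie is sent automatically, so protected pages are readable now.
            string html = await client.GetStringAsync("https://example.com/members-only");
            Console.WriteLine(html);
        }
    }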
I'm in need of a way to serve either HTML or links to external webpages. Basically, we will have webpages, and we would like to give a user of our site a URL
I want to auto-fill a web page and then submit the data in C#. How can I do this?

You will need to fake a POST HTTP request to the web server. You will need the following
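As a minimal sketch of that idea, WebClient.UploadValues can send the form fields the same way the browser would; the target URL and the field names below are assumptions and need to match the page's <form> markup.

    using System;
    using System.Collections.Specialized;
    using System.Net;
    using System.Text;

    class FormPoster
    {
        static void Main()
        {
            // Hypothetical field names -- take them from the name attributes in the page's <form>.
            var fields = new NameValueCollection
            {
                ["firstName"] = "John",
                ["comment"]   = "Hello"
            };

            using (var client = new WebClient())
            {
                // UploadValues sends the data as application/x-www-form-urlencoded,
                // which is what the browser sends when the form is submitted.
                byte[] response = client.UploadValues("https://example.com/form-handler", "POST", fields);
                Console.WriteLine(Encoding.UTF8.GetString(response));
            }
        }
    }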
I am trying to pass through some XML from an external website. What is the best way of doing this: through a C# webpage or ASP.NET MVC?

I tend to use something like this for working with external
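A minimal ASP.NET MVC sketch of such a pass-through (the controller name, action, and feed URL are assumptions): download the XML on the server and return it unchanged with an XML content type.

    using System.Net;
    using System.Web.Mvc;

    public class XmlProxyController : Controller
    {
        // Hypothetical pass-through action: fetch the external XML server-side
        // and hand it back to the caller with the right content type.
        public ActionResult Feed()
        {
            using (var client = new WebClient())
            {
                string xml = client.DownloadString("https://example.com/feed.xml");
                return Content(xml, "text/xml");
            }
        }
    }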
I have a website, http://www.bccfalna.com/, and the content on this site is in the Hindi language. I want to make all these pages read-only so that people cannot copy the content
Is it possible to have a C++ program like this...

    #include <iostream>
    using namespace std;

    int main()
My question: is there really an advantage to placing each webpage in its own directory, compared to putting them all in a single directory?
I want to read out a server status webpage every x seconds. The site is http://www.ffxiv-status.com/. How can I do this easily and fast?
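One simple approach, sketched below in C#: download the page in a loop and wait between requests with Task.Delay. The 30-second interval is only an example value for x, and parsing of the downloaded HTML is left out.

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    class StatusPoller
    {
        static async Task Main()
        {
            using var client = new HttpClient();
            var interval = TimeSpan.FromSeconds(30); // example interval -- substitute your own x

            while (true)
            {
                // Download the status page; parse it however you need afterwards.
                string html = await client.GetStringAsync("http://www.ffxiv-status.com/");
                Console.WriteLine($"{DateTime.Now}: fetched {html.Length} characters");

                await Task.Delay(interval);
            }
        }
    }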