Searching for files on a site and copying them into an array
I want to search for *.rules files on this site: http://cvs.snort.org/viewcvs.cgi/snort/rules/
and copy the filenames into an array.
How can I do this? For example:
$array=array('backdoor.rules','bad-traffic.rules');
I can use
$str = file_get_contents('http://cvs.snort.org/viewcvs.cgi/snort/rules');
but how can I search?
You need to use a regex to extract the matching links from the page source:
$url = 'http://cvs.snort.org/viewcvs.cgi/snort/rules/';
$contents = file_get_contents($url);
//Regex to match any links that have .rules in them
preg_match_all('/<a href="([^"]+\.rules[^"]*)">/i', $contents, $result, PREG_PATTERN_ORDER);
foreach ($result[1] as $link) {
//Fetch the link and save
}
This code fetches the contents of the web page and runs a regex that matches any link containing .rules.
Inside the loop you can then download and save each file (using file_get_contents, then fopen, fwrite and fclose).
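A minimal sketch of that loop body, assuming the hrefs in the listing are relative to $url (if they turn out to be absolute, drop the concatenation):
foreach ($result[1] as $link) {
    // Assumption: the href is relative, so prepend the base URL
    $data = file_get_contents($url . $link);
    // Save under the bare filename, stripping any path or query string
    $name = basename(parse_url($link, PHP_URL_PATH));
    $fp = fopen($name, 'w');
    fwrite($fp, $data);
    fclose($fp);
}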
1. Get the contents
$contents = file_get_contents('http://cvs.snort.org/viewcvs.cgi/snort/rules/');
2. Search and put the found filenames in $matches[1]
preg_match_all('/href="([^"]+\.rules)"/', $contents, $matches);
3. Do anything with the array
$matches[1];
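For example, to check what it contains:
print_r($matches[1]);
// prints something like: Array ( [0] => backdoor.rules [1] => bad-traffic.rules ... )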