I’m using this code to read multiple remote files:
PHP
$filters = ["https://example.com/file.txt", "https://example.com/file1.txt", "https://example.com/file3.txt"];

function parseFilterLists($filters)
{
    foreach ($filters as $filter) {
        $file = file_get_contents($filter);
        $parsed = preg_replace('/!.*/', '', $file);
        $parsed = preg_replace('/\|\|([\w\d]+(?:\.[\w]+)+)(?:[^$=~].*)/', '*://*.$1/*', $parsed);
    }
    $output = array_filter(explode("\n", $parsed), function ($url) {
        return preg_match('/^\*:\/\/\*\.[\w\d-]+\.[\w]+\/\*$/', $url);
    });
    return array_values(array_unique($output));
}
I’ve noticed that the output is truncated, as if only one file were processed, but what I need is to join the three files so I can manipulate them together. How can I achieve this?
Answer
$parsed is overwritten on each iteration of the foreach loop, so by the time $output is built after the loop it only contains results from the last file. You should define $output before the loop, and append the results of each iteration to it.
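The overwrite is easy to see in isolation; this minimal sketch (not from the original code) mirrors what happens to $parsed:

```php
<?php
$parsed = null;
foreach (['first', 'second', 'third'] as $item) {
    $parsed = $item; // reassigned on every pass; the previous value is lost
}
echo $parsed; // only the value from the last iteration survives
```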
PHP
<?php
$filters = ["https://example.com/file.txt", "https://example.com/file1.txt", "https://example.com/file3.txt"];

function parseFilterLists($filters)
{
    // Define a buffer for the output
    $output = [];
    foreach ($filters as $filter) {
        $file = file_get_contents($filter);
        $parsed = preg_replace('/!.*/', '', $file);
        $parsed = preg_replace('/\|\|([\w\d]+(?:\.[\w]+)+)(?:[^$=~].*)/', '*://*.$1/*', $parsed);
        // Get the output for the file we're working on in this iteration
        $currOutput = array_filter(explode("\n", $parsed), function ($url) {
            return preg_match('/^\*:\/\/\*\.[\w\d-]+\.[\w]+\/\*$/', $url);
        });
        // Append the output from the current file to the output buffer
        $output = array_merge($output, $currOutput);
    }
    // Return the unique results
    return array_unique($output);
}
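One caveat worth adding: file_get_contents() returns false when a request fails, and passing false to preg_replace() will produce an empty result for that file. A guard inside the loop (an addition, not part of the original answer) skips unreachable lists:

```php
<?php
$file = file_get_contents($filter);
if ($file === false) {
    continue; // skip this list rather than parsing a failed fetch
}
```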