But after everything was scraped and ready to use in my checking job, I got around 66k proxies.
The actual number without duplicates should be around 14k. When I scraped the same sources outside OB2 and deduped them myself, I found the rest were duplicates, since the APIs often return many of the same proxies, with only a few unique to each.
My idea is to have them refresh periodically, with duplicates removed automatically through the OB2 job tab while my checking job runs, instead of me manually scraping them outside, deduping, and putting them in a local group rather than using remote sources.
What I do is run a local webserver on the same PC as OB2, and I use a PHP script to scrape the proxies and write them to three files (http, socks4, socks5). That way you only need to add three sources to the job (e.g. http://127.0.0.1/http.txt), but you keep full control of what gets written to those files, from as many sources as you want. I'm also going to add my own proxy check and remove dupes before anything gets written to the proxy lists.
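For anyone who wants to try the same setup, here is a minimal sketch of what such a script could look like. The source URLs are placeholders (swap in whatever scraping APIs you actually use), and it assumes PHP's allow_url_fopen is enabled:

```php
<?php
// Minimal sketch: pull proxy lists from several sources per protocol,
// keep only ip:port entries, dedupe, and write one file per type.
// The URLs below are placeholders, not real proxy sources.
$sources = [
    'http'   => ['https://example.com/http-proxies.txt'],
    'socks4' => ['https://example.com/socks4-proxies.txt'],
    'socks5' => ['https://example.com/socks5-proxies.txt'],
];

foreach ($sources as $type => $urls) {
    $proxies = [];
    foreach ($urls as $url) {
        $body = @file_get_contents($url); // requires allow_url_fopen
        if ($body === false) {
            continue; // skip sources that are down
        }
        foreach (preg_split('/\r\n|\r|\n/', $body) as $line) {
            $line = trim($line);
            // keep only ip:port entries
            if (preg_match('/^\d{1,3}(\.\d{1,3}){3}:\d+$/', $line)) {
                $proxies[] = $line;
            }
        }
    }
    // array_unique drops the duplicates shared between sources
    $proxies = array_values(array_unique($proxies));
    file_put_contents($type . '.txt', implode(PHP_EOL, $proxies));
    echo $type . ': ' . count($proxies) . " unique proxies\n";
}
```

Drop it in your webserver's document root and point OB2's remote proxy sources at http://127.0.0.1/http.txt, http://127.0.0.1/socks4.txt and http://127.0.0.1/socks5.txt.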
Download and Install XAMPP:
- Download the version suitable for your operating system (Windows, Linux, macOS).
- Follow the installation instructions.

Start XAMPP:
- Launch the XAMPP Control Panel.
- Start the Apache module by clicking “Start” next to Apache.

Prepare Your PHP Script:
- Open a text editor (like Notepad++ or Visual Studio Code).
- Copy the PHP script above into the editor.
- Save the file as scrape_proxies.php in the htdocs directory of your XAMPP installation (e.g. C:\xampp\htdocs\ on Windows, /Applications/XAMPP/htdocs/ on macOS).

Run the PHP Script:
- Open your web browser.
- Navigate to http://localhost/scrape_proxies.php.
- The script will fetch proxies, check them, remove duplicates, and write the valid ones to http.txt, socks4.txt, and socks5.txt.
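The checking part is the piece the steps above gloss over. One common way to do it in PHP (this is an illustration, not the exact script referenced above; the test URL and 5-second timeout are arbitrary choices) is to route a cURL request through each proxy and keep the ones that answer:

```php
<?php
// Sketch of a proxy liveness check with cURL.
// Test URL and 5-second timeout are arbitrary choices.
function proxyWorks(string $proxy, int $type = CURLPROXY_HTTP): bool
{
    $ch = curl_init('http://example.com/');
    curl_setopt_array($ch, [
        CURLOPT_PROXY          => $proxy, // "ip:port"
        CURLOPT_PROXYTYPE      => $type,  // or CURLPROXY_SOCKS4 / CURLPROXY_SOCKS5
        CURLOPT_RETURNTRANSFER => true,   // capture instead of printing the body
        CURLOPT_CONNECTTIMEOUT => 5,
        CURLOPT_TIMEOUT        => 5,
    ]);
    $ok = curl_exec($ch) !== false;
    curl_close($ch);
    return $ok;
}

// Example: filter a deduped list down to responsive proxies.
$candidates = ['1.2.3.4:8080', '5.6.7.8:3128']; // placeholder entries
$alive = array_values(array_filter($candidates, 'proxyWorks'));
```

Checking proxies one by one like this is slow for thousands of entries; PHP's curl_multi_* functions can run the checks in parallel if that becomes a bottleneck.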
If you want, I can add a checkbox to dedupe proxies after loading them from the sources. Just open an issue on GitHub; that seems like a very reasonable thing to add.