Community discussions

 
Shumkov
just joined
Topic Author
Posts: 15
Joined: Tue Oct 01, 2019 9:08 pm

Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Oct 01, 2019 11:00 pm

Hello!
The new "output=user" parameter opened up scripting capabilities that I decided to take full advantage of.

- the script does not need third-party servers, since address lists are downloaded directly from the source and processed directly on the router.

- the script does NOT save the downloaded files to the disk (thereby preventing premature wear and failure of the disk).

- the script can be adapted to download and process any number of address lists in a similar format (the maximum file size is 63 KiB (64512 bytes), which is better than the old 4 KiB limit).

At the moment the script can download and update the following lists:
- DShield
- Spamhaus DROP
- Spamhaus EDROP
- Abuse.ch SSLBL

Variant 1:
ip firewall address-list
:local update do={
:do {
:local data ([/tool fetch url=$url output=user as-value]->"data")
remove [find list=blacklist comment=$description]
:while ([:len $data]!=0) do={
:if ([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}") do={
:do {add list=blacklist address=([:pick $data 0 [:find $data $delimiter]].$cidr) comment=$description timeout=1d} on-error={}
}
:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
}
} on-error={:log warning "Address list <$description> update failed"}
}
$update url=https://www.dshield.org/block.txt description=DShield delimiter=("\t") cidr=/24
$update url=https://www.spamhaus.org/drop/drop.txt description="Spamhaus DROP" delimiter=("\_")
$update url=https://www.spamhaus.org/drop/edrop.txt description="Spamhaus EDROP" delimiter=("\_")
$update url=https://sslbl.abuse.ch/blacklist/sslipblacklist.txt description="Abuse.ch SSLBL" delimiter=("\r")
- the script deletes all addresses matching the condition "list=blacklist comment=$description" and then fills the address list from scratch. It's simpler and faster.

Variant 2:
ip firewall address-list
:local update do={
:do {
:local data ([/tool fetch url=$url output=user as-value]->"data")
:local array [find dynamic list=blacklist]
:foreach value in=$array do={:set array ($array,[get $value address])}
:while ([:len $data]!=0) do={
:if ([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}") do={
:local ip ([:pick $data 0 [:find $data $delimiter]].$cidr)
:do {add list=blacklist address=$ip comment=$description timeout=1d} on-error={
:do {set ($array->([:find $array $ip]-[:len $array]/2)) timeout=1d} on-error={}
}
}
:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
}
} on-error={:log warning "Address list <$description> update failed"}
}
$update url=https://www.dshield.org/block.txt description=DShield delimiter=("\t") cidr=/24
$update url=https://www.spamhaus.org/drop/drop.txt description="Spamhaus DROP" delimiter=("\_")
$update url=https://www.spamhaus.org/drop/edrop.txt description="Spamhaus EDROP" delimiter=("\_")
$update url=https://sslbl.abuse.ch/blacklist/sslipblacklist.txt description="Abuse.ch SSLBL" delimiter=("\r")
- the script does NOT delete current addresses, but prolongs their timeout. Addresses that are not in the downloaded list are removed by the system automatically when their timeout expires. It's harder and slower, but it makes it possible to track the date/time at which addresses were added to the blacklist.
Why does the script use an "array"?
Because the built-in "find" function is VERY slow. Using an additional array speeds the script up several times, since operations are performed directly on indexes, bypassing the default "find" function.
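To illustrate the index trick, here is a sketch of the array layout that the Variant 2 code above relies on ($array and $ip are the script's own variables):

```routeros
# After the :foreach, $array holds all entry IDs in its first half and the
# corresponding addresses in its second half:
#   $array = (id1, id2, ..., idN, addr1, addr2, ..., addrN)
# [:find $array $ip] returns the index of the address in the second half;
# subtracting half the array length ([:len $array]/2 = N) gives the index
# of the matching ID, which can then be handed to "set" directly:
:local index ([:find $array $ip] - [:len $array] / 2)
set ($array->$index) timeout=1d
```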

Required policy: read, write, test.
Perhaps this script will be useful to someone.

P.S. Sorry for my English.
Last edited by Shumkov on Sat Dec 12, 2020 6:22 pm, edited 15 times in total.
 
Zebble
Frequent Visitor
Frequent Visitor
Posts: 50
Joined: Mon Oct 17, 2011 4:07 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Oct 18, 2019 12:12 am

Nice Work!

I added FireHOL Level2 to the script as well, in case you're interested. Just added this line:

$update url=https://raw.githubusercontent.com/ktsao ... el2.netset description="FireHOL Level2" delimiter=("\n")

-zeb
 
liuyao
just joined
Posts: 9
Joined: Wed Sep 04, 2019 9:14 am
Location: China

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Oct 18, 2019 4:29 pm

Hello:

Thank you for sharing. But the way you write functions is hard to understand. It would be great if someone could rewrite it in the style of the official examples. Thank you.
 
RackKing
Member
Member
Posts: 380
Joined: Wed Oct 09, 2013 1:59 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Nov 03, 2019 5:20 pm

Hi - This looks great. I will give it a try.

Update -
I just ran this and it works great - no errors, it works perfectly.

What is general recommendation on how often to grab new lists - daily?

Am I correct that it removes or ignores duplicate entries?

It would be great to keep this updated with additional lists!

Thank you so much for this!!!
Last edited by RackKing on Sun Nov 03, 2019 5:40 pm, edited 1 time in total.
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Nov 03, 2019 5:37 pm

How does it handle 1.2.3.0/24 addresses? As far as I could tell, it enters 1.2.3.0 in the address list without the /24.

Update: I ran the script and it does handle the range (cidr) correctly. Going to look if I can add some more lists.

Update 2: excellent script. I have added an option to filter on a specific label in the file, which can also be used to remove a list that is no longer needed from the current blacklist in the address list.
Last edited by msatter on Sun Nov 03, 2019 7:57 pm, edited 4 times in total.
 
RackKing
Member
Member
Posts: 380
Joined: Wed Oct 09, 2013 1:59 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Nov 03, 2019 5:42 pm

Nice Work!

I added FireHOL Level2 to the script as well, in case you're interested. Just added this line:

$update url=https://raw.githubusercontent.com/ktsao ... el2.netset description="FireHOL Level2" delimiter=("\n")

-zeb
This appears to fail for me.
 
RackKing
Member
Member
Posts: 380
Joined: Wed Oct 09, 2013 1:59 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Nov 03, 2019 7:50 pm

Nice Work!

I added FireHOL Level2 to the script as well, in case you're interested. Just added this line:

$update url=https://raw.githubusercontent.com/ktsao ... el2.netset description="FireHOL Level2" delimiter=("\n")

-zeb
This appears to fail for me.
This is the correct syntax
$update url=https://raw.githubusercontent.com/ktsaou/blocklist-ipsets/master/firehol_level2.netset description="FireHOL Level2" delimiter=("\n")
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Nov 03, 2019 7:51 pm

Nice Work!

I added FireHOL Level2 to the script as well, in case you're interested. Just added this line:

$update url=https://raw.githubusercontent.com/ktsao ... el2.netset description="FireHOL Level2" delimiter=("\n")

-zeb
This appears to fail for me.
It works if poster zeb puts it as code; here it is:
$update url=https://raw.githubusercontent.com/ktsaou/blocklist-ipsets/master/firehol_level2.netset description="FireHOL Level2" delimiter=("\n")

REALLY PLEASED with the script from Shumkov and the option added by MikroTik - it is now very easy to import lists without having to use other computers to prepare them up front.
 
RackKing
Member
Member
Posts: 380
Joined: Wed Oct 09, 2013 1:59 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Nov 03, 2019 8:00 pm

Nice Work!

I added FireHOL Level2 to the script as well, in case you're interested. Just added this line:

$update url=https://raw.githubusercontent.com/ktsao ... el2.netset description="FireHOL Level2" delimiter=("\n")

-zeb
This appears to fail for me.
It works if poster zeb puts it as code; here it is:
$update url=https://raw.githubusercontent.com/ktsaou/blocklist-ipsets/master/firehol_level2.netset description="FireHOL Level2" delimiter=("\n")

REALLY PLEASED with the script from Shumkov and the option added by MikroTik - it is now very easy to import lists without having to use other computers to prepare them up front.
That Level2 list is huge... trying to sort out the different levels they have. Any thoughts? Also, would you run this daily?
 
Shumkov
just joined
Topic Author
Posts: 15
Joined: Tue Oct 01, 2019 9:08 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Nov 03, 2019 8:39 pm

Do not forget about file size - maximum 63 KiB.
If the file size is larger than the maximum, only part of the file will be processed (the first 63 KiB), and the rest of the file will be discarded.
FireHOL Level2 is bigger than 63 KiB :)
What is general recommendation on how often to grab new lists - daily?
I set the scheduler interval to 8 hours.
In general, the interval depends on the specific list and the frequency of updating this list by its provider.
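For reference, an 8-hour interval can be configured like this (a sketch; the script name "blacklist-update" is an assumption - use whatever name you saved the script under):

```routeros
# Run the address-list downloader script every 8 hours.
# The policy matches the script's requirements: read, write, test.
/system scheduler add name="update-blacklists" interval=8h \
    on-event="/system script run blacklist-update" policy=read,write,test
```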
it removes or ignores duplicate entries?
The script removes only addresses that are in the "blacklist" list and have the matching comment=$description.
 
RackKing
Member
Member
Posts: 380
Joined: Wed Oct 09, 2013 1:59 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Nov 03, 2019 10:53 pm

Do not forget about file size - maximum 63 KiB.
If the file size is larger than the maximum, only part of the file will be processed (the first 63 KiB), and the rest of the file will be discarded.
FireHOL Level2 is bigger than 63 KiB :)
What is general recommendation on how often to grab new lists - daily?
I set the scheduler interval to 8 hours.
In general, the interval depends on the specific list and the frequency of updating this list by its provider.
it removes or ignores duplicate entries?
The script removes only addresses that are in the "blacklist" list and have a comment=description.
Ah - that makes sense. You are quite correct. Thanks for the explanation on the removal.

Are there any other lists you would consider or a good source?
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Nov 04, 2019 1:58 am

It would be nice if it were possible to use a filter so that only the needed data ends up in the variable. That would leave a lot more space in the variable.
:local data ([/tool fetch url=$url output=user as-value~"^[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}"]->"data");
 
RackKing
Member
Member
Posts: 380
Joined: Wed Oct 09, 2013 1:59 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Nov 04, 2019 2:42 am

Do not forget about file size - maximum 63 KiB.
If the file size is larger than the maximum, only part of the file will be processed (the first 63 KiB), and the rest of the file will be discarded.
FireHOL Level2 is bigger than 63 KiB :)
What is general recommendation on how often to grab new lists - daily?
I set the scheduler interval to 8 hours.
In general, the interval depends on the specific list and the frequency of updating this list by its provider.
it removes or ignores duplicate entries?
The script removes only addresses that are in the "blacklist" list and have a comment=description.
It looks like FireHOL Level1 may be a better choice, and it is under the file size limit... barely. Any reason not to use it? A list that large would probably have a pretty big performance hit on the router?

@Shumkov what was your goal/strategy for the lists you chose? I am trying to work out which lists should be used and what a happy medium is.

Edit - after taking a closer look, it appears the individual sources you are using are very similar to firehol_level1. With a goal of having no false positives, this is a great place to start. I guess whether you grab them individually or through FireHOL is personal preference.

What a great script - thank you very much.
Last edited by RackKing on Mon Nov 04, 2019 3:34 am, edited 2 times in total.
 
RackKing
Member
Member
Posts: 380
Joined: Wed Oct 09, 2013 1:59 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Nov 04, 2019 2:56 am

malc0de

$update url=http://malc0de.com/bl/IP_Blacklist.txt description="Malc0de" delimiter=("\n")
 
Shumkov
just joined
Topic Author
Posts: 15
Joined: Tue Oct 01, 2019 9:08 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Nov 04, 2019 9:51 am

It would be nice if this would be possible using a filter to have only the needed data in the variable. So there would be a lot more space in the variable
This does not work :)
"data" is an element of the array, and is accepted for processing only in its entirety - you cannot process only part of the element.
@Shumkov what was your goal/strategy based on the lists you choose? I am trying to sort what lists should be used and what is a happy medium.

Edit - after taking a closer look it appears the individual sources you are using is very similar to firehol_level1.
That's right, I took FireHOL Level1 as the basis.
I removed “Feodo Tracker” and “Ransomware Tracker”, replaced “Bambenek C2” with “Bambenek High-Confidence C2” (as Bambenek himself recommended), and also removed “Fullbogons” - I get those via BGP.
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Nov 04, 2019 10:38 am

It would be nice if this would be possible using a filter to have only the needed data in the variable. So there would be a lot more space in the variable
This does not work :)
"data" is an element of the array, and is accepted for processing only in its entirety - you cannot process only part of the element.
I agree, and my angle is to filter the traffic (stream) on its way into the data array.

Like this in scripting:
wget -q -O - $url | gawk --posix --field-separator=, '/^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}/ { print "$i a=" $1;}'  > $saveTo/$filename
This is something only MikroTik could create - intercepting the stream.
 
RackKing
Member
Member
Posts: 380
Joined: Wed Oct 09, 2013 1:59 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Nov 04, 2019 2:29 pm

That's right, I took FireHOL Level1 as the basis.
I removed “Feodo Tracker” and “Ransomware Tracker”, replaced “Bambenek C2” with “Bambenek High-Confidence C2” (as Bambenek himself recommended), and also removed “Fullbogons” - I get those via BGP.
Makes perfect sense. Thank you again so much for this.
 
RackKing
Member
Member
Posts: 380
Joined: Wed Oct 09, 2013 1:59 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Nov 04, 2019 4:00 pm

Is there a way to check the file size and have it trigger the email tool if it gets beyond the max file size?
 
Shumkov
just joined
Topic Author
Posts: 15
Joined: Tue Oct 01, 2019 9:08 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Nov 07, 2019 12:41 pm

Is there a way to check the file size and have it trigger the email tool if it gets beyond the max file size?
You can try this:
:if (([/tool fetch url=<url> output=user as-value]->"total")>63) do={/tool e-mail send ...}
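Spelled out as a runnable fragment (a sketch: the list URL, the recipient address and an already configured /tool e-mail server are assumptions):

```routeros
# Fetch the list once as a value; "total" reports the size in KiB.
:local result [/tool fetch url="https://www.spamhaus.org/drop/drop.txt" output=user as-value]
# Anything above 63 KiB is truncated by fetch, so warn by e-mail.
:if (($result->"total") > 63) do={
    /tool e-mail send to="admin@example.com" subject="Blacklist exceeds 63 KiB" \
        body="The downloaded list is larger than the fetch limit and was truncated."
}
```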
 
RackKing
Member
Member
Posts: 380
Joined: Wed Oct 09, 2013 1:59 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Nov 07, 2019 2:13 pm

Thanks you for that.

Do you have a dedicated link for the fullbogons piece? I cannot seem to find a direct URL for it.
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Nov 07, 2019 2:41 pm

I tried endlessly to find that, and this is great. I knew the "total" part but did not think of putting it in the variable.
if (([:tool fetch url=$url output=user as-value]->"total")<64) do={:local data ([:tool fetch url={$url output=user as-value]->"data")} else= {tool e-mail send ...}
It did not work for me.
Last edited by msatter on Fri Nov 08, 2019 10:52 am, edited 1 time in total.
 
Shumkov
just joined
Topic Author
Posts: 15
Joined: Tue Oct 01, 2019 9:08 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Nov 08, 2019 9:30 am

Do you have a dedicated link for the fullbogons piece? I cannot seem to find a direct URL for it.
Fullbogons_IPv4: http://www.team-cymru.org/Services/Bogo ... s-ipv4.txt
All bogon lists: https://www.team-cymru.com/bogon-reference-http.html
Bogons via BGP: https://www.team-cymru.com/bogon-reference-bgp.html
 
RackKing
Member
Member
Posts: 380
Joined: Wed Oct 09, 2013 1:59 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Nov 08, 2019 10:49 am

Do you have a dedicated link for the fullbogons piece? I cannot seem to find a direct URL for it.
Fullbogons_IPv4: http://www.team-cymru.org/Services/Bogo ... s-ipv4.txt
All bogon lists: https://www.team-cymru.com/bogon-reference-http.html
Bogons via BGP: https://www.team-cymru.com/bogon-reference-bgp.html
Many thanks.
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Nov 08, 2019 2:10 pm

Do not insert lists bigger than 63 KiB - they will only be loaded incompletely.
# Written by Shumkov
# Adapted by blacklister
# 20201025
{
/ip firewall address-list
:local update do={
 :do {
 :local result [/tool fetch url=$url as-value output=user]; :if ($result->"downloaded" != "63") do={ :local data ($result->"data")
  :do { remove [find list=$blacklist] } on-error={}
   :while ([:len $data]!=0) do={
      :if ([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}") do={
      :do {add list=$blacklist address=([:pick $data 0 [:find $data $delimiter]].$cidr) timeout=7d} on-error={}
      }
   :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
   } ;  :log warning "Imported address list <$blacklist> from file: $url"
   } else={:log warning "Address list: <$blacklist>, downloaded file too big: $url" }
 } on-error={:log warning "Address list <$blacklist> update failed"}
}

$update url=https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_level1.netset blacklist="firehole-1" delimiter=("\n") 
$update url=https://raw.githubusercontent.com/ktsaou/blocklist-ipsets/master/firehol_level2.netset blacklist="firehole-2" delimiter=("\n") 
}
The first is loaded and the second is not because of the size, being over 63KiB

I use separate blacklists and not one blacklist with different comments.

Update: also using now:
~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}"
Last edited by msatter on Sun Oct 25, 2020 10:06 am, edited 6 times in total.
 
RackKing
Member
Member
Posts: 380
Joined: Wed Oct 09, 2013 1:59 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Nov 08, 2019 4:21 pm

Do not insert lists bigger than 63 KiB - they will only be loaded incompletely.
# Written by Shumkov
# Adapted by blacklister
# 20191108

/ip firewall address-list
:local update do={
 :do {
 :local result [/tool fetch url=$url as-value output=user]; :if ($result->"downloaded" != "63") do={ :local data ($result->"data")
  :do { remove [find list=$blacklist] } on-error={}
   :while ([:len $data]!=0) do={
      :if (([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}") do={
      :do {add list=$blacklist address=([:pick $data 0 [:find $data $delimiter]].$cidr) timeout=7d} on-error={}
      }
   :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
   } ;  :log warning "Imported address list < $blacklist> from file: $url"
   } else={:log warning "Address list: <$blacklist>, downloaded file to big: $url" }
 } on-error={:log warning "Address list <$blacklist> update failed"}
}

$update url=https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_level1.netset blacklist="firehole-1" delimiter=("\n") 
$update url=https://raw.githubusercontent.com/ktsaou/blocklist-ipsets/master/firehol_level2.netset blacklist="firehole-2" delimiter=("\n") 
The first is loaded and the second is not because of the size being over 63KiB

I use separate blacklists and not one blacklist with different comments.
I gave this a shot - but it did not run. No message in the log and no address list.
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Nov 08, 2019 4:37 pm

Do not insert lists bigger than 63 KiB - they will only be loaded incompletely.
# Written by Shumkov
# Adapted by blacklister
# 20191108

/ip firewall address-list
:local update do={
 :do {
 :local result [/tool fetch url=$url as-value output=user]; :if ($result->"downloaded" != "63") do={ :local data ($result->"data")
  :do { remove [find list=$blacklist] } on-error={}
   :while ([:len $data]!=0) do={
      :if (([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}") do={
      :do {add list=$blacklist address=([:pick $data 0 [:find $data $delimiter]].$cidr) timeout=7d} on-error={}
      }
   :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
   } ;  :log warning "Imported address list < $blacklist> from file: $url"
   } else={:log warning "Address list: <$blacklist>, downloaded file to big: $url" }
 } on-error={:log warning "Address list <$blacklist> update failed"}
}

$update url=https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_level1.netset blacklist="firehole-1" delimiter=("\n") 
$update url=https://raw.githubusercontent.com/ktsaou/blocklist-ipsets/master/firehol_level2.netset blacklist="firehole-2" delimiter=("\n") 
The first is loaded and the second is not because of the size being over 63KiB

I use separate blacklists and not one blacklist with different comments.
I gave this a shot - but it did not run. No message in the log and no address list.
Remove one of the "(" in the line beginning with
:if (([:pick
 
hci
Long time Member
Long time Member
Posts: 679
Joined: Fri May 28, 2004 5:10 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Feb 28, 2020 12:53 am

I imagine the 63k limit is due to a variable size limit in Mikrotik scripting? It would be nice to be able to download larger blacklists.
 
sjafka
Member Candidate
Member Candidate
Posts: 105
Joined: Wed Jan 03, 2018 5:45 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Feb 28, 2020 12:43 pm

This is beautiful, mate! Thanks for your work! If you have a site with a PayPal donation button, I would like to buy you a beer! :D
P.S.: I used the Squid blacklist before, but the guy who created it died last year (RIP m8, and thank you for your work!). It had around 30k entries; this one has "only around 1500", but I see a lot of /24 subnets, so this is a huge list too!
 
inteq
Member
Member
Posts: 429
Joined: Wed Feb 25, 2015 8:15 pm
Location: Romania

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Mar 01, 2020 2:30 pm

PSA
Make sure you have whitelisted your private IPs if using https://raw.githubusercontent.com/fireh ... el1.netset
 
xenuc
just joined
Posts: 3
Joined: Mon Mar 02, 2020 8:28 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Mar 02, 2020 9:07 am

The script is great, thanks. Now we just wait another 10 years to bypass the 65k limit.
 
HZsolt
newbie
Posts: 31
Joined: Tue Apr 24, 2018 7:31 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Mar 02, 2020 9:46 pm

I changed 63 to 8192 and downloaded the larger blacklists, but not all lines loaded properly into the address-list.

https://raw.githubusercontent.com/fireh ... el1.netset
https://raw.githubusercontent.com/fireh ... el2.netset
https://raw.githubusercontent.com/fireh ... el3.netset
https://raw.githubusercontent.com/fireh ... el4.netset

How can I merge the above address-lists into one? I would like to use a single address-list in the MikroTik firewall instead of four. Fewer lines in the firewall, faster processing and less load.
 
Shumkov
just joined
Topic Author
Posts: 15
Joined: Tue Oct 01, 2019 9:08 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 03, 2020 5:06 pm

Variant 2:
ip firewall address-list
:local update do={
:do {
:local data ([/tool fetch url=$url output=user as-value]->"data")
:local array [find dynamic list=blacklist]
:foreach value in=$array do={:set array ($array,[get $value address])}
:while ([:len $data]!=0) do={
:if ([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}") do={
:local ip ([:pick $data 0 [:find $data $delimiter]].$cidr)
:do {add list=blacklist address=$ip comment=$description timeout=1d} on-error={
:do {set ($array->([:find $array $ip]-[:len $array]/2)) timeout=1d} on-error={}
}
}
:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
}
} on-error={:log warning "Address list <$description> update failed"}
}
$update url=https://feeds.dshield.org/block.txt description=DShield delimiter=("\t") cidr=/24
$update url=https://www.spamhaus.org/drop/drop.txt description="Spamhaus DROP" delimiter=("\_")
$update url=https://www.spamhaus.org/drop/edrop.txt description="Spamhaus EDROP" delimiter=("\_")
$update url=https://sslbl.abuse.ch/blacklist/sslipblacklist.txt description="Abuse.ch SSLBL" delimiter=("\r")
- the script does NOT delete current addresses, but prolongs their timeout. Addresses that are not in the downloaded list are removed by the system automatically when their timeout expires. It's harder and slower, but it makes it possible to track the date/time at which addresses were added to the blacklist.
Why does the script use an "array"?
Because the built-in "find" function is VERY slow. Using an additional array speeds the script up several times, since operations are performed directly on indexes, bypassing the default "find" function.
Last edited by Shumkov on Sun Oct 25, 2020 7:52 am, edited 4 times in total.
 
HZsolt
newbie
Posts: 31
Joined: Tue Apr 24, 2018 7:31 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 03, 2020 7:57 pm

Variant 2:
ip firewall address-list
:local update do={
:do {
:local data ([:tool fetch url=$url output=user as-value]->"data")
:local array [find dynamic list=blacklist]
:foreach value in=$array do={:set array (array,value,[get $value address])}
:while ([:len $data]!=0) do={
:if ([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}") do={
:local ip ([:pick $data 0 [:find $data $delimiter]].$cidr)
:do {add list=blacklist address=$ip comment=$description timeout=1d} on-error={
:do {set ($array->([:find key=$ip in=$array]-1)) timeout=1d} on-error={}
}
}
:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
}
} on-error={:log warning "Address list <$description> update failed"}
}
$update url=http://feeds.dshield.org/block.txt description=DShield delimiter=("\t") cidr=/24
$update url=http://www.spamhaus.org/drop/drop.txt description="Spamhaus DROP" delimiter=("\_")
$update url=http://www.spamhaus.org/drop/edrop.txt description="Spamhaus EDROP" delimiter=("\_")
$update url=http://osint.bambenekconsulting.com/feeds/c2-ipmasterlist-high.txt description="Bambenek High-Confidence C2" delimiter=("\2C")
$update url=https://sslbl.abuse.ch/blacklist/sslipblacklist.txt description="Abuse.ch SSLBL" delimiter=("\r")
- the script does NOT delete actual addresses, but prolongs their timeout. Addresses that are not in the downloadable list are deleted by the system automatically after their timeout. It's harder and slower :), but it makes it possible to track the date/time of addresses added to the blacklist.
Why is the script using an "array"?
Because the default "find" function is VERY slow. Using an additional array allows to speed up the script several times, since operations are performed directly with the indexes, bypassing the default "find" function.
With the above script, can I properly download the full lists below?
https://raw.githubusercontent.com/fireh ... el1.netset
https://raw.githubusercontent.com/fireh ... el2.netset
https://raw.githubusercontent.com/fireh ... el3.netset
https://raw.githubusercontent.com/fireh ... el4.netset
 
Shumkov
just joined
Topic Author
Posts: 15
Joined: Tue Oct 01, 2019 9:08 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 03, 2020 8:15 pm

With the above script can I properly (full lists) download the below lists?
You can’t download the full lists - 63 KiB is a limitation of RouterOS; here, scripts are powerless.
 
HZsolt
newbie
Posts: 31
Joined: Tue Apr 24, 2018 7:31 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 03, 2020 8:20 pm

With the above script can I properly (full lists) download the below lists?
You can’t download the full lists - 63 KiB is a limitation of RouterOS; here, scripts are powerless.
What is the 63 KiB limitation of RouterOS?
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 03, 2020 9:50 pm

My version checks for lists larger than 63 KiB and then logs whether the list was loaded or not.

There is no way to import a list bigger than that through an array.

Bigger lists can be used, but that is another story.
 
Krusty
Frequent Visitor
Frequent Visitor
Posts: 76
Joined: Fri May 02, 2008 11:14 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Wed Mar 11, 2020 11:23 am

LifeSaver, thank you guys you are awesome
 
Shumkov
just joined
Topic Author
Posts: 15
Joined: Tue Oct 01, 2019 9:08 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 12, 2020 2:05 pm

Bugfix:
- correct regexp is "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}"

The bug is not critical, it’s just that in some cases the script could process strings containing not only IP addresses, but simply numerical combinations similar in format.
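The difference is easy to check in a terminal (a sketch; :put simply prints the boolean result of the ~ match):

```routeros
# With unescaped dots, "." also matches a digit, so a plain number passes:
:put ("123456789012" ~ "^[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}")
# With escaped dots, only a real dotted address matches:
:put ("123456789012" ~ "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}")
:put ("198.51.100.1" ~ "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}")
```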
 
Xtreme512
Member Candidate
Member Candidate
Posts: 119
Joined: Sun Jun 08, 2014 2:43 pm
Location: Nicosia, CY

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Wed Mar 25, 2020 4:14 am

Nice, very nice working script, thank you!

The 64 KB limit, on the other hand, is so annoying though... Gotta find a workaround, like maybe splitting files on-the-fly?
 
frantacech
just joined
Posts: 4
Joined: Tue Jul 25, 2017 6:55 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Apr 18, 2020 11:14 pm

Hello!
How can I import it?

For example, the aggregated IPs for China?
https://www.ipdeny.com/ipblocks/
https://www.ipdeny.com/ipblocks/data/ag ... gated.zone

I tried, but it doesn't work.
 
shed909
just joined
Posts: 2
Joined: Sat Apr 25, 2020 5:59 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Apr 25, 2020 6:02 am

ip firewall address-list
:local update do={
:do {
:local data ([/tool fetch url=$url output=user as-value]->"data")
remove [find list=blacklist comment=$description]
:while ([:len $data]!=0) do={
:if ([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}") do={
:do {add list=blacklist address=([:pick $data 0 [:find $data $delimiter]].$cidr) comment=$description timeout=1d} on-error={}
}
:if ([:pick $data 0 [:find $data "\n"]]~"[a-z0-9]+([\\-\\.]{1}[a-z0-9]+)*\\.[a-z]{2,5}(:[0-9]{1,5})?(\\/.*)?") do={
:do {add list=blacklist address=([:pick $data 0 [:find $data $delimiter]].$cidr) comment=$description timeout=1d} on-error={}
}
:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
}
} on-error={:log warning "Address list <$description> update failed"}
}
$update url=http://feeds.dshield.org/block.txt description=DShield delimiter=("\t") cidr=/24
$update url=http://www.spamhaus.org/drop/drop.txt description="Spamhaus DROP" delimiter=("\_")
$update url=http://www.spamhaus.org/drop/edrop.txt description="Spamhaus EDROP" delimiter=("\_")
$update url=http://osint.bambenekconsulting.com/fee ... t-high.txt description="Bambenek High-Confidence C2" delimiter=("\2C")
$update url=https://sslbl.abuse.ch/blacklist/sslipblacklist.txt description="Abuse.ch SSLBL" delimiter=("\r")
$update url=http://malc0de.com/bl/IP_Blacklist.txt description="malc0de" delimiter=("\n")
$update url=https://raw.githubusercontent.com/ktsao ... el2.netset description="FireHOL Level2" delimiter=("\n")
$update url=https://raw.githubusercontent.com/fireh ... el1.netset description="FireHOL Level1" delimiter=("\n")
$update url=https://raw.githubusercontent.com/hecto ... g/list.txt description="hectorm adaway.org" delimiter=("\n")
Trying to add support for address lists containing URLs as opposed to IPs, such as hectorm's lists for Pi-hole:
https://discourse.pi-hole.net/t/update- ... 2019/13620

However, some of the comments come out as the actual URL entry, and the timeouts aren't set.
Last edited by shed909 on Sat Apr 25, 2020 7:13 am, edited 3 times in total.
 
shed909
just joined
Posts: 2
Joined: Sat Apr 25, 2020 5:59 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Apr 25, 2020 6:04 am

Hello!
how can i import it?

for example aggregated ip from china?
https://www.ipdeny.com/ipblocks/
https://www.ipdeny.com/ipblocks/data/ag ... gated.zone

I try, but it doesn't work
Try:
$update url=https://www.ipdeny.com/ipblocks/data/ag ... gated.zone description="IPdeny cn-aggregated" delimiter=("\n")
 
pukka
just joined
Posts: 20
Joined: Sun Jun 26, 2011 4:05 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri May 01, 2020 2:22 pm

How do we get around this 63 KiB limit? Can we ask MikroTik about this?

We are trying to automate the download and add of

https://www.ipdeny.com/ipblocks/data/countries/gb.zone which is 124KiB
 
Lebzul
Member Candidate
Member Candidate
Posts: 110
Joined: Wed Feb 21, 2018 12:54 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat May 16, 2020 11:38 pm

Don't forget to add
{:delay 20};
at the beginning of the script to give time if running after reboot is needed.
 
User avatar
jvanhambelgium
Forum Guru
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun May 17, 2020 8:55 am

How do we get around this 63KiB limit? can we ask mikrotik about this

We are trying to automate the download and add of

https://www.ipdeny.com/ipblocks/data/countries/gb.zone which is 124KiB
Perhaps the only way is to have some really smart script parse the list further into larger CIDR blocks: take several /24 "lines" and aggregate them where they are adjacent.
I've seen a script of that sort here somewhere (used in another context); it should be doable to gain a certain percentage of reduction.
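The aggregation idea can be sketched off-router in a few lines of Python with the standard library: adjacent /24 "lines" collapse into fewer, larger CIDR blocks, shrinking the file before the router fetches it. The 192.0.x.0 networks are example data.

```python
import ipaddress

# Four adjacent /24s collapse into a single /22.
nets = [ipaddress.ip_network("192.0.%d.0/24" % i) for i in range(4)]
collapsed = list(ipaddress.collapse_addresses(nets))
print(collapsed)  # [IPv4Network('192.0.0.0/22')]
```

How much this helps depends entirely on how many adjacent blocks the source list happens to contain.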
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun May 17, 2020 11:20 am

The problem is that you first have to read the whole list before you can start reducing it.

If MikroTik implemented resumable downloads, we could chop the file up into small parts.
 
User avatar
mozerd
Forum Veteran
Forum Veteran
Posts: 927
Joined: Thu Oct 05, 2017 3:39 pm
Location: Canada
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon May 18, 2020 2:02 pm

According to the following Manual:Scripting-examples -- file size limitation has been removed
Read and write large files

Many users requested ability to work with files. Now you can do it without limitations

Create and write to file:

:global newContent "new file content\r\nanother line\r\n";
[/lua "local f=assert(io.open('/test.txt', 'w+')); f:write(newContent); f:close()" ];
Read file content to variable:

:global cnt ""
[/lua "local f=assert(io.open('/test.txt', 'r')); cnt=f:read('*all'); f:close()" ];
:put $cnt
I just found this wiki entry but I have not tried to adapt to blacklists ..... if this code actually works that would be excellent.
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon May 18, 2020 3:34 pm

According to the following Manual:Scripting-examples -- file size limitation has been removed
Read and write large files

Many users requested ability to work with files. Now you can do it without limitations

Create and write to file:

:global newContent "new file content\r\nanother line\r\n";
[/lua "local f=assert(io.open('/test.txt', 'w+')); f:write(newContent); f:close()" ];
Read file content to variable:

:global cnt ""
[/lua "local f=assert(io.open('/test.txt', 'r')); cnt=f:read('*all'); f:close()" ];
:put $cnt
I just found this wiki entry but I have not tried to adapt to blacklists ..... if this code actually works that would be excellent.
Is /lua back then?
 
User avatar
jvanhambelgium
Forum Guru
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon May 18, 2020 3:39 pm

Don't think so. That Wiki page states : This page was last edited on 18 October 2017, at 10:37.
As it says on the page : After RouterOS v4.0beta4, Lua support is removed until further notice
 
User avatar
mozerd
Forum Veteran
Forum Veteran
Posts: 927
Joined: Thu Oct 05, 2017 3:39 pm
Location: Canada
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon May 18, 2020 4:38 pm

Don't think so. That Wiki page states : This page was last edited on 18 October 2017, at 10:37.
As it says on the page : After RouterOS v4.0beta4, Lua support is removed until further notice
My sincere apologies -- I did not see the part that says "After RouterOS v4.0beta4, Lua support is removed until further notice".

What a shame; all that Lua content should be removed from the wiki, IMO. But it certainly would be nice if MikroTik brought back Lua support, or provided another means to work with files of any size.
 
Lebzul
Member Candidate
Member Candidate
Posts: 110
Joined: Wed Feb 21, 2018 12:54 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat May 23, 2020 8:09 pm

Is there a reasonable way of bypassing MikroTik's limit, or another approach?
I'm a hardcore MikroTik user, but I'm considering other vendors if they don't offer a better concept for protecting our equipment.

BTW, is there a way to have these working?
https://github.com/firehol/blocklist-ipsets/blob/master/firehol_level1.netset			40.8 KB
https://github.com/firehol/blocklist-ipsets/blob/master/normshield_all_wannacry.ipset	         6.15 KB
https://github.com/firehol/blocklist-ipsets/blob/master/normshield_all_bruteforce.ipset.     	4.64 KB
https://github.com/firehol/blocklist-ipsets/blob/master/dshield_30d.netset					2.17 KB
https://github.com/firehol/blocklist-ipsets/blob/master/spamhaus_edrop.netset				1.98 KB
https://github.com/firehol/blocklist-ipsets/blob/master/dshield_7d.netset					1.5 KB
https://github.com/firehol/blocklist-ipsets/blob/master/normshield_all_webscan.ipset	1.42 KB
https://github.com/firehol/blocklist-ipsets/blob/master/dshield.netset						1.04 KB
https://github.com/firehol/blocklist-ipsets/blob/master/normshield_all_wormscan.ipset	0.97 KB
https://github.com/firehol/blocklist-ipsets/blob/master/normshield_all_dnsscan.ipset	0.86 KB
Last edited by Lebzul on Sat May 23, 2020 10:25 pm, edited 2 times in total.
 
User avatar
jvanhambelgium
Forum Guru
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat May 23, 2020 8:38 pm

Is there a reasonable way of bypassing Mk's limit or another approach?
I'm a Mk hardcore user but I'm considering other vendors if they do not apply a better concept to protect our equipments.
Nothing wrong with the concept, I think. Deploying such huge IP lists and filtering against them is not unlimited with other vendors either.
Eg. Palo Alto networks.

A maximum of 10 External Block Lists (PanOS 7.x) on a PA-200
A maximum of 50000 IPs in all external lists combined. (1 list with 50000 IPs or 10 Lists with 5000 IPs both are supported)
If you use more than 10 EBLs in a device you will see the following error during commit:
Exceeding max number of supported external block lists (10)

In terms of hardware limits:

Hardware Maximum Address Entries
PA-220 : 2500
PA-820 : 2500
PA-850 : 3500
PA-3020 : 5000
PA-5020 :10000
PA-5220 : 40000
PA-7050 : 80000

So......


The only option is multiple cascaded lists that each remain within the ~63 KiB processing boundary.
But indeed, you need some intermediate processing step to properly "prepare" the file before the device downloads it; that cannot be the show-stopper, I guess.
 
Lebzul
Member Candidate
Member Candidate
Posts: 110
Joined: Wed Feb 21, 2018 12:54 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat May 23, 2020 10:27 pm

Is there a reasonable way of bypassing Mk's limit or another approach?
I'm a Mk hardcore user but I'm considering other vendors if they do not apply a better concept to protect our equipments.
Nothing wrong with the concept I think. The idea of deploying such huge massive IP-lists and filter against them is something not infinitely possible also with other vendors.
Eg. Palo Alto networks.

A maximum of 10 External Block Lists (PanOS 7.x) on a PA-200
A maximum of 50000 IPs in all external lists combined. (1 list with 50000 IPs or 10 Lists with 5000 IPs both are supported)
If you use more than 10 EBLs in a device you will see the following error during commit:
Exceeding max number of supported external block lists (10)

In terms of harware limit

Hardware Maximum Address Entries
PA-220 : 2500
PA-820 : 2500
PA-850 : 3500
PA-3020 : 5000
PA-5020 :10000
PA-5220 : 40000
PA-7050 : 80000

So......


The only option is multiple cascaded lists that each remain within the boundary of 65K processing.
But indeed, you need some intermediate processing thing to properly "prepare" the file before download to the device, but that cannot be the show stopper I guess.
I see. And then, what do people do about servers with an open port? Let's say people need to access my server on a specific port?
 
User avatar
jvanhambelgium
Forum Guru
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun May 24, 2020 12:05 am

Is there a reasonable way of bypassing Mk's limit or another approach?
I'm a Mk hardcore user but I'm considering other vendors if they do not apply a better concept to protect our equipments.
Nothing wrong with the concept I think. The idea of deploying such huge massive IP-lists and filter against them is something not infinitely possible also with other vendors.
Eg. Palo Alto networks.

A maximum of 10 External Block Lists (PanOS 7.x) on a PA-200
A maximum of 50000 IPs in all external lists combined. (1 list with 50000 IPs or 10 Lists with 5000 IPs both are supported)
If you use more than 10 EBLs in a device you will see the following error during commit:
Exceeding max number of supported external block lists (10)

In terms of harware limit

Hardware Maximum Address Entries
PA-220 : 2500
PA-820 : 2500
PA-850 : 3500
PA-3020 : 5000
PA-5020 :10000
PA-5220 : 40000
PA-7050 : 80000

So......


The only option is multiple cascaded lists that each remain within the boundary of 65K processing.
But indeed, you need some intermediate processing thing to properly "prepare" the file before download to the device, but that cannot be the show stopper I guess.
I see. And then, how do people do with servers with an open port? Let's say people need to access my server in a specific port?
It is important to take all considerations into account when you make the design. If "people" are in fact scattered across the world, coming from virtually anyplace, then perhaps you need to provide this service at another level, e.g. use some form of authentication for your users (possibly combined with a VPN application).
What is sitting behind this specific port? Is it something that understands the concept of user authentication?
If you run a business and you know your users (e.g. employees) are located in country X, then filter strictly and only allow country-X IPs, which already reduces the attack surface a lot.
 
Lebzul
Member Candidate
Member Candidate
Posts: 110
Joined: Wed Feb 21, 2018 12:54 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu May 28, 2020 6:08 am

Nice Work!

I added FireHOL Level2 to the script as well, in case you're interested. Just added this line:

$update url=https://raw.githubusercontent.com/ktsao ... el2.netset description="FireHOL Level2" delimiter=("\n")

-zeb
Lv1 was working fine and now it is not. Probably it does not fit anymore.
 
User avatar
mozerd
Forum Veteran
Forum Veteran
Posts: 927
Joined: Thu Oct 05, 2017 3:39 pm
Location: Canada
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu May 28, 2020 1:15 pm

Lv1 was working fine and now it is not. Probably it does not fit anymore.
You [everyone] should be aware that:

level1 check frequency = 1 minute and average update frequency = 2 hours and 27 minutes
level2 check frequency = 1 minute and average update frequency = 17 minutes
level3 check frequency = 1 minute and average update frequency = 45 minutes
level4 check frequency = 1 minute and average update frequency = 44 minutes
webclient check frequency = 1 minute and average update frequency = 8 hours and 36 minutes
webserver check frequency = 1 minute and average update frequency = 23 hours and 16 minutes

So why is this important to note?
Because changes [adds/deletions] are frequent, and that can change the file size dramatically.
Also important to note: many duplicates remain when lists are combined, so your processing engine needs to remove the duplicates and then reorder them for faster processing.

Depending on which MikroTik Router model being used MOAB combines some of these lists or ALL of these lists 3 times each day spaced 8 hours apart.
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri May 29, 2020 3:18 pm

What is the recommended way to find out *why* an update failed?

Address list <Spamhaus DROP> update failed

It's great to see in the logs, but where do I look to try and figure out why it failed?

Spamhaus DROP and EDROP are not over 63 KiB, so that isn't the reason..

At the moment I am focusing on the IPs used for email SPAM, but it doesn't really matter.. I have the Spamhaus and Bambenek lists failing but I don't know why.
 
Shumkov
just joined
Topic Author
Posts: 15
Joined: Tue Oct 01, 2019 9:08 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri May 29, 2020 9:52 pm

What is the recommended way to find out *why* an update failed?

Address list <Spamhaus DROP> update failed

Is great to see in the logs, but where do I look to try and figure out why it failed?
This error occurs if the file is for some reason not available for download. Does the address list fail to load SOMETIMES, or always?
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri May 29, 2020 10:42 pm


This error occurs if the file is for some reason not available for download. The address list does not load SOMETIMES? Or always?
They always fail to load.
 
User avatar
mozerd
Forum Veteran
Forum Veteran
Posts: 927
Joined: Thu Oct 05, 2017 3:39 pm
Location: Canada
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat May 30, 2020 3:57 pm


This error occurs if the file is for some reason not available for download. The address list does not load SOMETIMES? Or always?
They don't load always.
@kevinds
You should be aware that when loading lists, IF a duplicate IP is present, the list will not load and processing stops.
So it is critical that duplicate IPs be avoided via a pre-process that first checks for duplicates, removes them, reorders [sorts] the list for faster processing, and then proceeds with the load.
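The pre-processing described above (merge several downloaded lists, drop duplicate entries, sort them numerically) can be sketched off-router in Python. The documentation prefixes here are example data.

```python
import ipaddress

# Two example feeds that share one entry.
list_a = ["198.51.100.0/24", "203.0.113.0/24"]
list_b = ["203.0.113.0/24", "192.0.2.0/24"]  # contains one duplicate

# A set removes the duplicate; sorting by numeric network address
# (and prefix length) gives a stable, ordered result for the router.
merged = sorted(
    {ipaddress.ip_network(entry) for entry in list_a + list_b},
    key=lambda n: (int(n.network_address), n.prefixlen),
)
print([str(n) for n in merged])
# ['192.0.2.0/24', '198.51.100.0/24', '203.0.113.0/24']
```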
 
Shumkov
just joined
Topic Author
Posts: 15
Joined: Tue Oct 01, 2019 9:08 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat May 30, 2020 7:25 pm

You should be aware that when loading lists IF a duplicate IP is present the list will not load and processing stops.
Script ignores duplicates via on-error={}, processing is not interrupted.
 
User avatar
mozerd
Forum Veteran
Forum Veteran
Posts: 927
Joined: Thu Oct 05, 2017 3:39 pm
Location: Canada
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat May 30, 2020 8:37 pm

You should be aware that when loading lists IF a duplicate IP is present the list will not load and processing stops.
Script ignores duplicates via on-error={}, processing is not interrupted.
Do you mean this line: on-error={:log warning "Address list <$description> update failed"} ?

Where in your script do you check for duplicate ip?
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat May 30, 2020 8:44 pm


Do you mean this line: on-error={:log warning "Address list <$description> update failed"} ?
comment=$description timeout=1d} on-error={}
 
User avatar
mozerd
Forum Veteran
Forum Veteran
Posts: 927
Joined: Thu Oct 05, 2017 3:39 pm
Location: Canada
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun May 31, 2020 3:40 pm

comment=$description timeout=1d} on-error={}
Thanks .... I just tested @Shumkov code and it works very nicely .... excellent work.
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun May 31, 2020 4:38 pm

Thanks .... I just tested @Shumkov code and it works very nicely .... excellent work.
Yeah, I have a couple of honeypot IPs that, when hit, add the IP to a drop rule; then a script runs that expands the /32 to a larger block. I needed something similar to handle multiple IPs from the same larger block, for when asshats decide to use an entire /16 to port-scan every port.. lol

That was interesting to watch.. haha

But yeah, I still have lots to learn, and I'm not sure how to get a better log of why both variants are failing.
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Jun 01, 2020 11:55 pm

My version:
   } ;  :log warning "Imported address list < $blacklist> from file: $url"
   } else={:log warning "Address list: <$blacklist>, downloaded file too big: $url" }
 } on-error={:log warning "Address list <$blacklist> update failed"}
Collecting ranges of IP addresses that are knocking at the door: viewtopic.php?f=2&t=152953&p=758068&hil ... os#p758068
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Jun 02, 2020 12:05 am

My version:
   } ;  :log warning "Imported address list < $blacklist> from file: $url"
   } else={:log warning "Address list: <$blacklist>, downloaded file too big: $url" }
 } on-error={:log warning "Address list <$blacklist> update failed"}
Collecting ranges of IP addresses that are knocking at the door: viewtopic.php?f=2&t=152953&p=758068&hil ... os#p758068
Ok.. Reading this...

It downloads the list.

It tries to import it

If successful, it gives a success message;
if the import fails, it says the file is too big..

So if the import fails for any reason, it says too big; what if it fails for another reason?

I don't see your version checking the file's size beforehand, so the error message could say 'Failed because a butterfly flapped its wings..' and it would still be a more useful error message (because it wouldn't be stating an incorrect reason for the failure). ;)

I hope I am wrong reading this.. If I am, I am very sorry.
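The distinction asked for here boils down to checking the downloaded size before importing and reporting each failure mode separately. A minimal sketch of that logic; the 64512-byte figure comes from the opening post, and the function and its messages are hypothetical:

```python
LIMIT = 64512  # the ~63 KiB fetch limit discussed in this thread

def classify_failure(data):
    # Report each failure mode separately instead of labelling every
    # error "too big". `data` is the fetched payload, or None when
    # the download itself failed.
    if data is None:
        return "download failed"
    if len(data) > LIMIT:
        return "file too big"
    return "ok"

print(classify_failure(None))           # download failed
print(classify_failure(b"x" * 70000))   # file too big
print(classify_failure(b"1.2.3.4\n"))   # ok
```

The same branching could be expressed in RouterOS script with a `:len` check on the fetched data before the import loop.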
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Jun 02, 2020 12:14 am

Collecting ranges of IP addresses that are knocking at the door: viewtopic.php?f=2&t=152953&p=758068&hil ... os#p758068
I do something very similar to the linked thread..

I have honeypot IP addresses; anything that attempts to connect to them gets its IP added to the block list. These addresses have never been used, so nothing legitimate would have any reason to try to connect.

Then another script runs and turns them into a /24, with a 7 day timeout..

Usually, the router has 60-75k addresses in the list at any time. After a reboot the list is reset, takes 6-12 to get back up there.
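The "turn them into a /24" step described above amounts to widening the offending host address to its enclosing /24 before re-adding it with the longer timeout; in Python terms (203.0.113.77 is example data):

```python
import ipaddress

# strict=False lets ip_network zero out the host bits, turning a
# single offender into its enclosing /24 block.
offender = "203.0.113.77"
block = ipaddress.ip_network(offender + "/24", strict=False)
print(block)  # 203.0.113.0/24
```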
 
User avatar
Jotne
Forum Guru
Forum Guru
Posts: 3343
Joined: Sat Dec 24, 2016 11:17 am
Location: Magrathean

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Jun 02, 2020 8:28 am

I have honey-pot IP addresses, anything that attempts to connect to them, gets their IP added to the block list, these addresses have never been used, so nothing legitimate would have any reason to try and connect.
I do nearly the same. Since I do not have an extra public IP, I have an access rule: if anyone tries to connect to a port that is not open, they get blocked on all ports (65535 minus the 6 open ones), including the normally open ports (6 ports), for 24 hours.
 
Sigma

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Jun 07, 2020 10:01 pm

Have to say thank you OP for the Script works great out-of-the-box. :D

Sincerely
Sigma
 
User avatar
mac86
Member Candidate
Member Candidate
Posts: 126
Joined: Sat Nov 25, 2006 12:52 am
Location: bahia blanca - argentina
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Jun 20, 2020 12:38 am

Hello!
...........

EXCELENT POST.
THANK YOU!!!
 
Lebzul
Member Candidate
Member Candidate
Posts: 110
Joined: Wed Feb 21, 2018 12:54 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Jul 18, 2020 3:38 pm

comment=$description timeout=1d} on-error={}
Thanks .... I just tested @Shumkov code and it works very nicely .... excellent work.
If this aforementioned line is like that, then the OP has a typo.
 
faxxe
newbie
Posts: 40
Joined: Wed Dec 12, 2018 1:46 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Oct 24, 2020 6:48 pm

Hi
I tried the different scripts, but on all lists I get "Address list <name of the list> update failed"
CCR1009 v6.46.7

What could be wrong?

-faxxe
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Oct 24, 2020 9:29 pm

Hi
i tryed the different scripts but get on all lists "Address list <name of the list> update failed"
CCR1009 v6.46.7
What could be wrong?
-faxxe
Do you have by any chance spaces or special characters in the names of the lists?
 
faxxe
newbie
Posts: 40
Joined: Wed Dec 12, 2018 1:46 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Oct 24, 2020 11:09 pm


Do you have by any chance spaces or special characters in the names of the lists?
In which lists? I have to define them before running the scripts? :/ Maybe i use it wrong ...

-faxxe
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Oct 25, 2020 12:08 am

It is indeed a bit confusing. Originally there was one address list named "blacklist", and the description/comment separated the different imported lists.

Please post the script you use; then we can have a look at it.
 
faxxe
newbie
Posts: 40
Joined: Wed Dec 12, 2018 1:46 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Oct 25, 2020 12:40 am

Please post the scipt you use then can have a look at it.
ip firewall address-list
:local update do={
:do {
:local data ([:tool fetch url=$url output=user as-value]->"data")
:local array [find dynamic list=blacklist]
:foreach value in=$array do={:set array (array,[get $value address])}
:while ([:len $data]!=0) do={
:if ([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}") do={
:local ip ([:pick $data 0 [:find $data $delimiter]].$cidr)
:do {add list=blacklist address=$ip comment=$description timeout=1d} on-error={
:do {set ($array->([:find $array $ip]-[:len $array]/2)) timeout=1d} on-error={}
}
}
:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
}
} on-error={:log warning "Address list <$description> update failed"}
}
$update url=http://feeds.dshield.org/block.txt description=DShield delimiter=("\t") cidr=/24
$update url=http://www.spamhaus.org/drop/drop.txt description="Spamhaus DROP" delimiter=("\_")
$update url=http://www.spamhaus.org/drop/edrop.txt description="Spamhaus EDROP" delimiter=("\_")
$update url=http://osint.bambenekconsulting.com/feeds/c2-ipmasterlist-high.txt description="Bambenek High-Confidence C2" delimiter=("\2C")
$update url=https://sslbl.abuse.ch/blacklist/sslipblacklist.txt description="Abuse.ch SSLBL" delimiter=("\r")
Thank you, msatter :)

-faxxe
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Oct 25, 2020 1:33 am

Quick check: the first line should be /ip firewall address-list, but the needed "/" in front was not copied, which matters when you are already inside a menu.

Update: I see that everyone omits this "/", and it works if you are already at the root of the menus. I always put a "/" in front to be sure I land where I need to, every time, wherever I am.

This list contains no IP addresses anymore and should be removed from your list.
http://osint.bambenekconsulting.com/feeds/c2-ipmasterlist-high.txt
 
faxxe
newbie
Posts: 40
Joined: Wed Dec 12, 2018 1:46 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Oct 25, 2020 9:02 am

Update: I see that all omit this "/" and this works if you are already in the root of the menus. I always put a "/" in front to be sure I land where I need, every time, where ever I am.
Thank you, I added the "/" to the first line, but with the same result. None of the lists can be updated

-faxxe
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Oct 25, 2020 10:05 am

Update: I see that all omit this "/" and this works if you are already in the root of the menus. I always put a "/" in front to be sure I land where I need, every time, where ever I am.
Thank you, i add the "/" to the first line but with the same result. All lists cant get updated
-faxxe
Are you running the code directly in the terminal, or did you, as intended, put it in a script box and then run the script?

To test, I have put "{" at the beginning and "}" at the end in my version; check whether you can run that directly in the terminal. This will also work in a script box (/system script)

viewtopic.php?f=9&t=152632&p=824755#p759427

Remark: at the moment both files are smaller than 64 KB, so they load with no problem.

It could also be that your firewall settings don't allow downloads directly from the router. Test this with the following in the terminal:
 /tool fetch url=http://feeds.dshield.org/block.txt  as-value output=user 
If you get "failure: connection timeout", then your firewall is blocking it.
 
faxxe
newbie
Posts: 40
Joined: Wed Dec 12, 2018 1:46 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Oct 25, 2020 11:06 am

It could also be the case that your firewall settings don't allow to download directly from the router. Test this with this in Terminal:
 /tool fetch url=http://feeds.dshield.org/block.txt  as-value output=user 
If you get: failure: connection timeout then your firewall is blocking.
Sir, you are entitled to a beer at my expense :) That's the problem. Connection timeout .....
I can use the ping command but I cannot download anything to the router. I have to solve this now.
Many thanks for your patience and helpfulness....
-faxxe
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Oct 27, 2020 6:45 am

Alright, I made some tweaks to allow logins to the download servers, in case the IP list you want to download is password protected..

Variant #1:
/ip firewall address-list
:local update do={
:do {
:local data ([:tool fetch url=$url user=$user password=$password output=user as-value]->"data")
remove [find list=blacklist comment=$description]
:while ([:len $data]!=0) do={
:if ([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}") do={
:do {add list=blacklist address=([:pick $data 0 [:find $data $delimiter]].$cidr) comment=$description timeout=1d} on-error={}
}
:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
}
} on-error={:log warning "Address list <$description> update failed"}
}
$update url=https://feeds.dshield.org/block.txt user="anonymous" password="anonymous" description=DShield delimiter=("\t") cidr=/24
$update url=https://www.spamhaus.org/drop/drop.txt user="anonymous" password="anonymous" description="Spamhaus DROP" delimiter=("\_")
$update url=https://www.spamhaus.org/drop/edrop.txt user="anonymous" password="anonymous" description="Spamhaus EDROP" delimiter=("\_")
$update url=https://sslbl.abuse.ch/blacklist/sslipblacklist.txt user="anonymous" password="anonymous" description="Abuse.ch SSLBL" delimiter=("\r")

Variant #2:
/ip firewall address-list
:local update do={
:do {
:local data ([:tool fetch url=$url user=$user password=$password output=user as-value]->"data")
:local array [find dynamic list=blacklist]
:foreach value in=$array do={:set array (array,[get $value address])}
:while ([:len $data]!=0) do={
:if ([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}") do={
:local ip ([:pick $data 0 [:find $data $delimiter]].$cidr)
:do {add list=blacklist address=$ip comment=$description timeout=1d} on-error={
:do {set ($array->([:find $array $ip]-[:len $array]/2)) timeout=1d} on-error={}
}
}
:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
}
} on-error={:log warning "Address list <$description> update failed"}
}
$update url=http://feeds.dshield.org/block.txt user="anonymous" password="anonymous" description=DShield delimiter=("\t") cidr=/24
$update url=http://www.spamhaus.org/drop/drop.txt user="anonymous" password="anonymous" description="Spamhaus DROP" delimiter=("\_")
$update url=http://www.spamhaus.org/drop/edrop.txt user="anonymous" password="anonymous" description="Spamhaus EDROP" delimiter=("\_")
$update url=https://sslbl.abuse.ch/blacklist/sslipblacklist.txt user="anonymous" password="anonymous" description="Abuse.ch SSLBL" delimiter=("\r")
I don't know if the "user" and "password" are required to be set to "anonymous" on each update line, I did in case it sends the value from a previous line, to know/predict exactly what it is doing.

I figured out why the two variants above were failing for me: it is a common complaint/issue I have with RouterOS, it was using the wrong source IP address.
 
sachlj
just joined
Posts: 2
Joined: Fri Oct 30, 2020 10:11 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Oct 30, 2020 3:05 pm

$update url=http://feeds.dshield.org/block.txt description="DShield" delimiter=("\t") cidr=/24
$update url=http://www.spamhaus.org/drop/drop.txt description="Spamhaus DROP" delimiter=("\_")
$update url=http://www.spamhaus.org/drop/edrop.txt description="Spamhaus EDROP" delimiter=("\_")
$update url=https://osint.bambenekconsulting.com/fe ... t-high.txt description="Bambenek High-Confidence C2" delimiter=("\2C")
$update url=https://sslbl.abuse.ch/blacklist/sslipblacklist.txt description="Abuse.ch SSLBL" delimiter=("\r")
$update url=http://malc0de.com/bl/IP_Blacklist.txt description="malc0de" delimiter=("\n")
$update url=https://iplists.firehol.org/files/firehol_level2.netset description="FireHOL Level2" delimiter=("\n")
$update url=https://iplists.firehol.org/files/firehol_level1.netset description="FireHOL Level1" delimiter=("\n")
$update url=https://raw.githubusercontent.com/hecto ... g/list.txt description="hectorm adaway.org" delimiter=("\n")
..........................
$update url=https://raw.githubusercontent.com/hecto ... g/list.txt description="hectorm adaway.org" delimiter=("\n")
https://gist.github.com/sathwikv143/d2a ... 38342ef455
............................
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Nov 02, 2020 1:19 pm

$update url=http://feeds.dshield.org/block.txt description="DShield" delimiter=("\t") cidr=/24
$update url=http://www.spamhaus.org/drop/drop.txt description="Spamhaus DROP" delimiter=("\_")
$update url=http://www.spamhaus.org/drop/edrop.txt description="Spamhaus EDROP" delimiter=("\_")
$update url=https://osint.bambenekconsulting.com/fe ... t-high.txt description="Bambenek High-Confidence C2" delimiter=("\2C")
$update url=https://sslbl.abuse.ch/blacklist/sslipblacklist.txt description="Abuse.ch SSLBL" delimiter=("\r")
$update url=http://malc0de.com/bl/IP_Blacklist.txt description="malc0de" delimiter=("\n")
$update url=https://iplists.firehol.org/files/firehol_level2.netset description="FireHOL Level2" delimiter=("\n")
$update url=https://iplists.firehol.org/files/firehol_level1.netset description="FireHOL Level1" delimiter=("\n")
$update url=https://raw.githubusercontent.com/hecto ... g/list.txt description="hectorm adaway.org" delimiter=("\n")
..........................
$update url=https://raw.githubusercontent.com/hecto ... g/list.txt description="hectorm adaway.org" delimiter=("\n")
https://gist.github.com/sathwikv143/d2a ... 38342ef455
............................
Why all of these? Did you check what any of them offer? More than one of these lists has been taken down or is offline, at least one of them is too big for RouterOS, and at least one is a list of IP-list URLs, not an IP list that can be imported.
 
nickcarr
just joined
Posts: 13
Joined: Tue Jul 13, 2021 6:43 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Wed Jul 14, 2021 5:04 pm

Hello!
The new parameter "output=user" provided new scripting capabilities that I decided to take full advantage of.
....

P.S. Sorry for my English.
Thanks for posting it, and thanks to the other people as well.
Great job
 
elstiv73
just joined
Posts: 10
Joined: Wed Jun 10, 2020 9:34 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Aug 05, 2021 2:51 pm

Many thanks for this brilliant script. Should this be run via the scheduler (interval) every day? I am asking because the 'Timeout' column shows a 24-hour countdown timer, and I am not sure whether the script refreshes the entries automatically every 24 hours or whether we should rerun it via a scheduler interval. I am using the first variant of the code. Thanks
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Aug 05, 2021 3:08 pm

I figured out why the two variants were failing for me above.. It is/was a common complaint/issue I have with RouterOS, it was using the wrong Source-IP address.

/tool fetch src-address=
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Aug 05, 2021 5:24 pm

search tag # rextended definitive ip posix regex

Remember that these are written to be put directly into a script:
if tested in the terminal you must add \ before the ?,
if tested on regex101 the \ before \. and \/ must be removed.


The regexp currently used
[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}
matches anything from 0.0.0.0 up to 999.999.999.999

My POSIX regex is better and also matches the subnet mask, if present:
([0-2]{0,1}[0-9]{1,2}\\.){3}[0-2]{0,1}[0-9]{1,2}(\\/[0-3]{0,1}[0-9]{1,1}){0,1}
0.0.0.0 to 299.299.299.299
000.000.000.000 to 299.299.299.299
xxxx/0 to /39
xxxx/00 to /39


A correct regex that matches exactly 0.0.0.0/0 (or 000.000.000.000/00) through 255.255.255.255/32
is much more complicated, and the CPU sits at 100% until all lists are parsed...
with mandatory subnet mask
((25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])\\.){3}(25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])\\/(3[0-2]|[0-2]?[0-9])

with optional subnet mask
((25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])\\.){3}(25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])(\\/(3[0-2]|[0-2]?[0-9])){0,1}

without subnet mask
((25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])\\.){3}(25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])
The result can also be checked using :toip: if [:typeof] of the result is not "ip" or "ip-prefix", it is not a valid IP or IP/prefix.

Some IPs must also be skipped so you do not block your own networks if, by mistake (or not), your own LAN or WAN IP ends up on the blacklist...
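The patterns above can be sanity-checked in any POSIX-compatible regex engine; here is a quick Python sketch (note that the RouterOS script form doubles the backslashes, so `\\.` in the script is `\.` to the engine):

```python
import re

# RouterOS scripts escape "\" as "\\", so "\\." in the script is "\."
# to the regex engine; here we write the pattern as the engine sees it.
OCTET = r"(25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])"
IP_WITH_MASK = re.compile(rf"({OCTET}\.){{3}}{OCTET}/(3[0-2]|[0-2]?[0-9])")

def is_valid(line):
    """True only for a full IPv4 address with a mandatory /0-/32 mask."""
    return IP_WITH_MASK.fullmatch(line) is not None

print(is_valid("151.99.125.9/24"))     # True
print(is_valid("999.999.999.999/24"))  # False: octets above 255 rejected
print(is_valid("10.0.0.0/33"))         # False: mask above /32 rejected
```

The loose `[0-9]{1,3}...` pattern in the original script would accept all three of these lines, which is exactly the point being made here.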
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Aug 22, 2021 3:51 am

A Frankenstein script using HTTP chunking. It is not perfect, because it cannot predict where the splits are made, so you may miss some data or import wrong data. But it is a start, and it can certainly be improved. This is of course only usable if the HTTP server supports chunking.
USE LATER VERSION PUBLISHED BELOW
Inspiration: viewtopic.php?f=9&t=177530

Update: the chunking problem is solved by using a negative overlap of 512 bytes for each part. I first reduced the maximum size by 512 bytes so the parts stay in sync.
Update 2: the first array line of each chunk is now removed to avoid importing incomplete lines.
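Roughly, the overlap trick works like this (a Python sketch with toy sizes; the real script fetches parts of about 63 KiB, and the overlap must be larger than the longest line so a dropped boundary line always appears whole in an adjacent chunk):

```python
def fetch_chunks(data, size, overlap):
    """Simulate ranged fetches where each part re-reads the last
    `overlap` bytes of the previous one (the "negative overlap")."""
    pos, chunks = 0, []
    while pos < len(data):
        chunks.append(data[pos:pos + size])
        pos += size - overlap
    return chunks

def parse(chunks):
    """Drop each chunk's possibly-partial first and last lines; the
    overlap guarantees the dropped text is repeated intact elsewhere."""
    lines = []
    for i, chunk in enumerate(chunks):
        part = chunk.split("\n")
        if i > 0:
            part = part[1:]                      # first line may be cut
        if i + 1 < len(chunks) and not chunk.endswith("\n"):
            part = part[:-1]                     # last line may be cut
        lines.extend(l for l in part if l)
    return lines

data = "".join(f"{n}.{n}.{n}.0/24\n" for n in range(1, 6))
chunks = fetch_chunks(data, size=24, overlap=12)
print(sorted(set(parse(chunks))))  # all five prefixes recovered
```

Lines that fall entirely inside an overlap region can show up twice; on the router that is harmless, since the duplicate `add` simply fails.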
Last edited by msatter on Tue Aug 24, 2021 10:26 pm, edited 8 times in total.
 
User avatar
jvanhambelgium
Forum Guru
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Aug 22, 2021 10:00 am

Thanks both @msatter and @rextended for this revamp/update of the "generic lists downloader" able to pass the 64K boundary !
I'll give it a try for sure!
 
User avatar
jvanhambelgium
Forum Guru
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Aug 22, 2021 10:28 am

I've tested it against the Project Turris list and it obtained 9058 entries. Did you get similar values?
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Aug 22, 2021 12:05 pm

I've tested it against the Project Turris list and it obtained 9058 entries. Did you get similar values?
Yes; the first line of the file is ignored and the last 6 entries are IPv6 addresses. The script is still in the works, but the first results are promising.

Update: the script now avoids importing the incomplete first and last lines of a chunk, which might be corrupt because of the chunking.
 
profinet
just joined
Posts: 5
Joined: Mon Apr 23, 2018 1:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Aug 22, 2021 12:46 pm

Thank you @msatter. I posted these two scripts in another post, but I didn't know how to combine them.

Is it possible to add a delimiter option, so the Spamhaus database (or another list) can be imported with one script?
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Aug 22, 2021 1:48 pm

Thank you @msatter. I posted these two scripts in another post, but I didn't know how to combine them.

Is it possible to add a delimiter option, so the Spamhaus database (or another list) can be imported with one script?
Those use a range (example: /22), so the RegEx would have to be adapted/extended.

On second thought, I think it will work as is.

viewtopic.php?f=9&t=152632&p=873984#p826157
Last edited by msatter on Sun Aug 22, 2021 2:20 pm, edited 1 time in total.
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Aug 22, 2021 2:09 pm

@msatter
Obviously some lines are split in two between two parts, like
\n69.100.54.120/19\n
split in two: the first read ends with
..............\n69.100.5
and the second read starts with
4.120/19\n..............

The solution is simple: instead of reading exactly the maximum (64512 B), make each part slightly smaller, e.g. 63488 B, so the variable has room for the previous "remainder" on top.
For example, if "remainder" is "69.100.5" and the file part is "4.120/19\n..............", then with
:set varcontent "$remainder$filepart"
varcontent contains "69.100.54.120/19\n.............."
and the IP is read correctly.
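The same remainder-carry idea in a Python sketch (part sizes and contents are illustrative):

```python
# Remainder-carry stitching: keep whatever follows the last "\n" of each
# part and prepend it to the next part, so lines split across part
# boundaries are reassembled before parsing.

def parse_parts(parts):
    lines, remainder = [], ""
    for part in parts:
        data = remainder + part
        # rpartition keeps everything after the last "\n" as the new
        # remainder; with no "\n" at all, the whole part is carried over.
        head, _, remainder = data.rpartition("\n")
        lines.extend(l for l in head.split("\n") if l)
    if remainder:
        lines.append(remainder)  # final part may lack a trailing newline
    return lines

parts = ["..\n69.100.5", "4.120/19\n.."]
print(parse_parts(parts))  # → ['..', '69.100.54.120/19', '..']
```

Unlike the 512-byte overlap, this variant reads each byte exactly once, so no duplicate lines can occur.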
 
User avatar
jvanhambelgium
Forum Guru
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Aug 22, 2021 8:16 pm

I've also replaced the regex inside the script with the one @rextended explained earlier; it seems the most complete one.
I've managed to import the whole US CIDR set into a list, 61259 entries, just for testing. It took a looong time on my RB3011 using a single core, but that is nothing new and was already discussed earlier.
It doesn't really matter with these kinds of lists, which do not change often.

The more entries are added to the list, the slower it gets. My impression is that once you go above 30,000-40,000 entries, adding a few thousand more gets really, really slow.
For a reasonably small number of entries (e.g. 5000-15000) it looked pretty fast.
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Aug 22, 2021 8:45 pm

The RegEx is only there to detect whether there is a valid IP address on the line.

This version only allows one list; there are different versions that can do more, even one that lets you provide login credentials if needed.

I made this Frankenstein version to see if it worked.

About huge lists: importing them should not be slow as long as you delete the old entries first. To optimize things, a second array holds the IP addresses currently in the list, so we look in the array instead of in the address list itself. This was made for lists up to 63 KiB; now that chunking is used it could be optimized further, but not all lists are uniform, and that makes chunking flaky there. Better is to skip the comparison entirely and always delete all dynamic entries before adding, which is always the fastest method.

However, it has been a long time since I worked on this with others, so I am a bit out of touch with how it currently works.

Update: for HUGE lists, see variant 1 in the OP.
That was easy: three snips and the compare array was disabled; importing Turris was much, much faster this way. Try this version on a HUGE list and let me know?
Update 2:
I replaced the code with a more streamlined version and added some comments to make the workings easier to understand. Remark: Turris did its weekly update today and the number of lines is now 8327. I am taking this into production on my own router and can deactivate the cron job that generated the import list on a Linux system.
Update 3: added the option to apply extra filtering with keywords in a RegEx, like dns|sip (Hei rules), so that only matching lines are accepted out of the whole list. This is checked on every line.
Update 3.5: small optimizations: a value used only once is no longer stored in a variable; a variable is used only when the value is needed several times.
Example:
$update url=https://project.turris.cz/greylist-data/greylist-latest.csv delimiter=, listname=turris timeout=8d heirule=dns|sip

I have updated the script and the new version can be found in this posting: viewtopic.php?p=935938#p935938
Last edited by msatter on Sun May 29, 2022 1:54 pm, edited 11 times in total.
 
User avatar
jvanhambelgium
Forum Guru
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Aug 22, 2021 9:44 pm

A rerun of this script on my (already existing) [test] address list "us-blacklist" (61250 entries) only updates the dynamic timers (reset to 1d / 24:00:00). This whole operation went very fast, less than 1 minute.

A complete erase plus re-run of this version of the script is much, much faster: all ~62k entries were inserted into the list in "a few minutes" (< 5, I didn't take out the stopwatch).
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Aug 22, 2021 11:21 pm

Ping
"Give me a ping, Vasili. One ping only, please."
 
User avatar
mozerd
Forum Veteran
Forum Veteran
Posts: 927
Joined: Thu Oct 05, 2017 3:39 pm
Location: Canada
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Aug 23, 2021 6:19 pm

Try with this version on a HUGE list and let me know?
Update 2:
I replaced the code with a more streamlined version and added some comments to make the workings easier to understand. Remark: Turris did its weekly update today and the number of lines is now 8327. I am taking this into production on my own router and can deactivate the cron job that generated the import list on a Linux system.
@msatter, excellent code ..... very very fast !!!
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Aug 23, 2021 6:24 pm

@msatter, excellent code ..... very very fast !!!
If it wasn't for me and the method for downloading big files (> 64K), @msatter would have had nothing to work on...
viewtopic.php?f=9&t=177530
Last edited by rextended on Mon Aug 23, 2021 7:02 pm, edited 2 times in total.
 
User avatar
jvanhambelgium
Forum Guru
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Aug 23, 2021 6:47 pm

@msatter, excellent code ..... very very fast !!!
If it wasn't for me, for the methods of download big files (> 64K) @msatter would have nothing to work on...
viewtopic.php?f=9&t=177530
Nah, don't worry about that. It's very obvious you contributed the pivotal aspect of this approach/solution.
To my knowledge nobody had come up with this concept before; I've never seen it in any posting over the past years.
That was some clever problem solving, and you deserve full credit for this one.
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Aug 23, 2021 7:01 pm

To my knowledge nobody had come up with this concept before; I've never seen it in any posting over the past years.
That was some clever problem solving, and you deserve full credit for this one.
Remember, you and @jotne were my inspiration for that!
viewtopic.php?f=9&t=166293&p=872435#p872376
Last edited by rextended on Mon Aug 23, 2021 7:04 pm, edited 3 times in total.
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Aug 23, 2021 7:02 pm

I still do not like this method of importing lists into an address list without any sanitization first, and the use of on-error makes no sense either, as with the delete.
Sooner or later, through some "on-error" or on purpose from the website hosting the list,
0.0.0.0/0 will end up in the address list and block everything, or a wrong prefix like 151.99.125.9/2 (instead of /24) will block everything from 128.x.x.x to 191.x.x.x,
because 151.99.125.9/2 is imported into RouterOS as 128.0.0.0/2.


viewtopic.php?f=9&t=166293#p872049
I'm already working on a method that handles lists >64K and sanitizes what is imported, e.g.:
[...]
4) Create a whitelist; before adding an IP / IP prefix, check whether it is on the whitelist, and if so, do not add it.
5) On add, check whether the IP prefix is already contained in another IP prefix on the address list.
6) On add, check whether the IP prefix covers one or more IP prefixes already on the address list; remove the old ones and add the new, bigger one.
7) For safety, accept only /12 to /32 prefixes; /11 or less is too big to be plausible on IPv4...
8) Add an option to put the IP on the address list temporarily (dynamic) for a specified time (from 1 second to nearly 35 weeks);
such entries are not exported with the address list on export or backup.
With this option set, if the address is found again in the imported list, its time is reset (from 1 second to nearly 35 weeks) instead of deleting and re-importing it.
[...]
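For illustration, the whitelist and containment checks (and the dangerous /2 normalization mentioned above) can be sketched with Python's ipaddress module; `should_add` is a hypothetical helper, not part of any script in this thread:

```python
import ipaddress

# Why unsanitized input is dangerous: RouterOS (like ipaddress with
# strict=False) normalizes a prefix with host bits set to its network,
# so a typo like /2 instead of /24 becomes a quarter of the internet.
net = ipaddress.ip_network("151.99.125.9/2", strict=False)
print(net)  # → 128.0.0.0/2

def should_add(candidate, existing, whitelist):
    """Sketch of the sanitization checks: skip whitelisted or
    already-covered prefixes, supersede narrower existing ones.
    Returns (added?, new list of networks)."""
    cand = ipaddress.ip_network(candidate, strict=False)
    if any(cand.subnet_of(w) for w in whitelist):
        return False, existing                 # whitelisted: never add
    if any(cand.subnet_of(e) for e in existing):
        return False, existing                 # already covered
    kept = [e for e in existing if not e.subnet_of(cand)]
    return True, kept + [cand]                 # replace narrower entries
```

A real implementation would also apply the /12-to-/32 plausibility limit from point 7 before any of this.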
 
User avatar
jvanhambelgium
Forum Guru
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Aug 23, 2021 7:25 pm

Talking about sanitizing: I stumbled today on the fact that my "Turris" list was suddenly down to only ~3000 entries!
This happened at the time the list performs its update in the morning. Very weird.
In the afternoon I completely flushed/erased the list and started the script manually, and now it is back up to ~8K entries.

I'm not sure what happened there, and I'll do some more manual testing, because such a thing needs to be rock-solid and handle weirdness as the data is ingested and processed, even if that takes significantly longer.



Hmm, I've started the script and saw 2 entries fly by in the log: Address list <> update failed.
The list keeps the number of items it had before I started the script, so nothing was flushed, nor was the timestamp updated.
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Aug 23, 2021 8:48 pm

It is a weekly list, so just update it at 06:30 am; polling it more often will only create more load on their side. No wonder there is now a proxy in front. ;-)

Also nice is that you can select certain kinds of entries. This could be a second selector besides the IP-address presence.
Legend for current Hei rules
----------------------------

amplifiers           Easily exploitable services for amplification
broken_http          Broken inbound HTTP (known services)
cryptocoin           Cryptocoin miners
databases            Database servers
dns                  Incoming DNS queries
http_scan            HTTP/S scans
low_ports            Low ports (<1024)
netbios              NetBIOS
netis                Netis router exploit
ntp                  NTP
proxy_scan           Scans for HTTP/S and SOCKS proxies
remote_access        Remote access services (RDP, VNC, etc.)
samba                Samba (Windows shares)
sip                  SIP ports
ssdp                 SSDP
ssh                  SSH
synology             Synology NAS
telnet               Telnet
torrent              Common Torrent ports
Update to the script above: added the option to apply extra filtering with keywords in a RegEx, like dns|sip (Hei rules), so that only matching lines are accepted out of the whole list. This is checked on every line. If this option is omitted, every line with a valid IP address is imported.

Example:
$update url=https://project.turris.cz/greylist-data/greylist-latest.csv delimiter=, listname=turris timeout=8d heirule=dns|sip
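As a sketch of what the heirule option does (the CSV layout mimics the Turris greylist with an address as the first field and tags after it; the column names are assumed for illustration):

```python
import re

# Sketch of the heirule option: a line is imported only if it starts
# with a valid-looking IPv4 address AND matches the extra keyword RegEx
# (e.g. "dns|sip"). Without a heirule, every IP line is imported.
IP = re.compile(r"^[0-9]{1,3}(\.[0-9]{1,3}){3}")

def filter_lines(text, heirule=""):
    rule = re.compile(heirule) if heirule else None
    out = []
    for line in text.splitlines():
        if IP.match(line) and (rule is None or rule.search(line)):
            out.append(line.split(",")[0])   # address is the first field
    return out

csv = "Address,Tags\n1.2.3.4,dns\n5.6.7.8,telnet\n9.10.11.12,sip\n"
print(filter_lines(csv, "dns|sip"))  # → ['1.2.3.4', '9.10.11.12']
```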
 
User avatar
jvanhambelgium
Forum Guru
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Aug 23, 2021 9:47 pm

Not really relevant, but I did pick up something weird. I have the Turris list loaded, and I'm also using it in the forward chain to prevent any communication from inside my LAN towards any outside IP on that list.
Strangely enough I get some hits on this ;-(
The IP address 47.94.96.203 traces back to an IP on the list below, belonging to Alibaba Advertising Holding or something.

https://github.com/firehol/blocklist-ip ... ous.netset

The problem is that it seems to originate from my NAS, on which 15+ Docker containers and 3 VMs are running.
At the moment it's not clear what initiates it.
All the containers I run are from trusted repos; no funny stuff to my knowledge (more things like influxdb, grafana, telegraf, watchtower, mosquitto, etc.)

The packet towards the Alibaba IP was "ICMP" , only a single instance.
The packet towards another Turris marked IP is also found in the abuseIP database.
This packet was dropped trying to creep out of my LAN coming from "something" on my NAS, source-port 6800 > dst-port tcp/48881

I've tried on my Synology using tcpdump on the "docker0" bridge instance so I see a lot of action of 172.17.x.x (internal) container traffic, but I could not capture anything trying to reach the above IP's...
Interesting .... a home network ;-)

https://www.abuseipdb.com/check/45.146. ... WjcnBszQil


Interesting observations ;-)
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Aug 24, 2021 2:09 am

The packet towards the Alibaba IP was "ICMP" , only a single instance.
The packet towards another Turris marked IP is also found in the abuseIP database.
This packet was dropped trying to creep out of my LAN coming from "something" on my NAS, source-port 6800 > dst-port tcp/48881

I've tried on my Synology using tcpdump on the "docker0" bridge instance so I see a lot of action of 172.17.x.x (internal) container traffic, but I could not capture anything trying to reach the above IP's...
Interesting .... a home network ;-)

Interesting observations ;-)
Do you have solar panels and an inverter, and/or are you using ModBus to access them? That port range is often used to discover inverters or other ModBus equipment, though normally UDP broadcast is used.
 
User avatar
jvanhambelgium
Forum Guru
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Aug 24, 2021 9:13 pm

NOTE: The mystery of the ACL hit got solved; it turned out to be an old port forward for torrent traffic that hit the NAS (torrent client not running), and the NAS effectively replied back to that IP address instead of staying silent. No worries.

However, again today the update of the list did not go well.

At 2 PM the script starts and suddenly loses quite a few entries.
The weird thing: it remains stable for a few hours until around 5 PM, when I notice the list was completely emptied and removed. Hence no more data after 6 PM.
So does the latest version of the script UPDATE the dynamic timers, or does it create all downloaded entries from scratch?
Why on earth would it start throwing out entries hours after the download...

Last edited by jvanhambelgium on Wed Aug 25, 2021 12:06 am, edited 1 time in total.
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Aug 24, 2021 10:13 pm

I have trouble following you on this.

If you read about the greylist by Turris you will see that it is updated once a week. They know that; if you keep hammering the proxy, they might put your SRC IP address in jail. ;-)

BTW, you are using an old version of the script that keeps the list active for one day. I use eight days: seven days is the refresh interval, plus one day spare. After seven days the scheduler reads the new list, which Turris then leaves unchanged for another seven days.

Update: you're using a version that does not remove the old list, so you could end up blocking addresses that are no longer on the list.
Last edited by msatter on Tue Aug 24, 2021 10:24 pm, edited 1 time in total.
 
User avatar
jvanhambelgium
Forum Guru
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Aug 24, 2021 10:23 pm

Ah, yeah, you mentioned something like this earlier. I'll adapt the timeout of each entry to 1w and schedule the script to run once a week.
Nevertheless, I wonder why it behaves the way I see: the drop in entries I can understand if I receive only partial info from their end, but why the sudden drop to 0 entries several HOURS later?
That is the part that is not clear to me; it is as if the dynamic entries had a short lifespan, but the script issues a 1d lifetime.

Hmm, yeah, let's start running it only once a week and we'll see...

Thx for pointing it out.
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Aug 24, 2021 10:32 pm

I have removed the first version of the script to prevent this happening to others. The first version was more a proof of concept that the Frankenstein (two parts joined) script worked.
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Aug 31, 2021 12:25 pm

I have added support for domain names besides IP addresses. Not tested yet, but it should work.

The changes are in bold; I hope the '+' is supported, otherwise it could be replaced by a '*'.
:if (( $line~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}" || $line~"^.+\\.[a-z.]{2,7}" ) && $line~heirule) do={
This version is replaced by the version below, which finds the delimiter on its own.
Last edited by msatter on Thu Sep 09, 2021 4:46 pm, edited 1 time in total.
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Aug 31, 2021 1:32 pm

search tag # rextended DNS RegEx

POSIX syntax:
^(([a-zA-Z0-9][a-zA-Z0-9-]{0,61}){0,1}[a-zA-Z]\.){1,9}[a-zA-Z][a-zA-Z0-9-]{0,28}[a-zA-Z]$

for MikroTik:
$line~"^(([a-zA-Z0-9][a-zA-Z0-9-]{0,61}){0,1}[a-zA-Z]\\.){1,9}[a-zA-Z][a-zA-Z0-9-]{0,28}[a-zA-Z]\$"

limited plausibly to 9 levels of labels plus domain: x9x.x8x.x7x.x6x.x5x.x4x.x3x.x2x.x1x.domain,
and limited to 30 characters for the top-level domain (the longest actually existing is 24 characters, XN--VERMGENSBERATUNG-PWB)

Rules for DNS names:
the format is label.domain, label2x.label1x.domain, label3x.label2x.label1x.domain, etc. (ignoring the FQDN form label.domain. , never present in address lists)
the max length for a label or domain is 63 characters, but the longest top-level domain today is 24 characters (XN--VERMGENSBERATUNG-PWB)
the min length is 1 character for a label and 2 for a domain
the allowed characters are the case-insensitive letters a-z A-Z, the digits 0-9 and the minus - (the _ is used for special cases, not for full domain names)
the first or last (or only) character of a label or domain can't be -
the last (or only) character of a label or domain can't be a number
the first character of a domain can't be a number
the first character of a label can be a number but must be followed by at least one letter
the max length of the whole string is 253 characters


EDIT: for DNS static, whose RegEx does not support { }, use
^(([a-zA-Z0-9][a-zA-Z0-9-]*)?[a-zA-Z]\.)+[a-zA-Z][a-zA-Z0-9-]*[a-zA-Z]$
Thanks to @kcarhc
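A quick Python check of the simplified pattern (written with single backslashes, as the regex engine sees it):

```python
import re

# The simplified pattern from the EDIT above, without {m,n} repetition.
DNS = re.compile(
    r"^(([a-zA-Z0-9][a-zA-Z0-9-]*)?[a-zA-Z]\.)+[a-zA-Z][a-zA-Z0-9-]*[a-zA-Z]$"
)

print(bool(DNS.match("example.com")))  # True
print(bool(DNS.match("1.2.3.4")))      # False: never mistaken for an IP
print(bool(DNS.match("foo-.com")))     # False: label can't end with "-"
```

Because every label must end in a letter before its dot, dotted-quad IPs never match, which is what makes the pattern safe alongside the IP regexes.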
Last edited by rextended on Thu Jul 13, 2023 2:37 am, edited 2 times in total.
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

[Auto find delimiter] Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Sep 09, 2021 4:44 pm

I have implemented automatic determination of the delimiter. This should make the import script more flexible: there is no need anymore to specify a different delimiter for each list.

Next could be a version that also recognizes what kind of format is used: IPv4, IPv4 with range, domain names, IPv6. A list could then only contain one type; checking every line would be doable, but I assume the whole script would become very slow.

Update: added support for different kinds of lists: plain IPv4, IPv4 with range, and domain names. I use a simple RegEx for domain names.
Update 2: mixed lists are allowed when the delimiter is set on the config line, and the script shows which delimiter it is using. The output shows which kind of list was recognized.
:local R "[0-9]{1,3}"; # storing RegEX part in variable to have shorter strings in the code
:if ($sline ~ "^$R\\.$R\\.$R\\.$R")		do={:set $posix "^$R\\.$R\\.$R\\.$R";}
:if ($sline ~ "^$R\\.$R\\.$R\\.$R/[0-9]{1,2}")	do={:set $posix "^$R\\.$R\\.$R\\.$R/[0-9]{1,2}"}
:if ($sline ~ "^.+\\.[a-z.]{2,7}")		do={:set $posix "^.+\\.[a-z.]{2,7}"}
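The same sniffing idea in a Python sketch: classify the first data line once and reuse the winning pattern for the rest of the list (the pattern order here stands in for the script's successive overwrites of $posix):

```python
import re

# Format sniffing: try the most specific pattern first, so an IPv4 with
# a range is not misclassified as a plain IPv4.
R = r"[0-9]{1,3}"
PATTERNS = [
    ("ipv4-range", re.compile(rf"^{R}\.{R}\.{R}\.{R}/[0-9]{{1,2}}")),
    ("ipv4",       re.compile(rf"^{R}\.{R}\.{R}\.{R}")),
    ("domain",     re.compile(r"^.+\.[a-z.]{2,7}")),
]

def sniff(line):
    """Return the kind of entry the first data line looks like."""
    for kind, pat in PATTERNS:
        if pat.match(line):
            return kind
    return None

print(sniff("1.2.3.0/24"))   # ipv4-range
print(sniff("example.com"))  # domain
```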
A strange thing was that I could not use :local in the code above and had to resort to :set.

Using a defined delimiter should allow importing mixed lists; this still has to be implemented and can be done with a RegEx.
The new version can be found in a later posting.
Last edited by msatter on Sat Sep 11, 2021 11:01 am, edited 7 times in total.
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Sep 09, 2021 5:08 pm

Well done and nice idea...

If you want, you can use these regexes to determine what type of items the file contains: DNS, IP prefix, or plain IP.
search for valid DNS
(([a-zA-Z0-9][a-zA-Z0-9-]{0,61}){0,1}[a-zA-Z]\\.){1,9}[a-zA-Z][a-zA-Z0-9-]{0,28}[a-zA-Z]

IP-Prefix: IP with mandatory subnet mask
((25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])\\.){3}(25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])\\/(3[0-2]|[0-2]?[0-9])

IP or IP-Prefix if present optional subnet mask
((25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])\\.){3}(25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])(\\/(3[0-2]|[0-2]?[0-9])){0,1}

IP without prefix
((25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])\\.){3}(25[0-5]|(2[0-4]|[01]?[0-9]?)[0-9])
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Sep 11, 2021 11:00 am

A new version, and I think it is more or less complete now. I have also added a short explanation at the end of what all the parameters do.

Update: during import the script checks whether the source file has changed in size; if so, the import is retried after a 2-minute wait. If no successful import was possible, you now get a specific message.
Next update: previously, if an import failed, the list had already been erased beforehand. Deleting is now done only on a successful import; this is possible because all the current entries are first renamed to a backup address list, and that backup list is removed on a successful import. I don't change the timeout, so the entries could time out before the next import; keep an eye on the log if you use this.
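The backup-and-swap logic described above, sketched generically in Python (restoring on failure is one possible policy; `safe_import` and its arguments are illustrative, not RouterOS API):

```python
# Sketch of the backup-and-swap import: rename the current entries to a
# backup list, fetch and import the new ones, and only delete the backup
# once the import has succeeded.

def safe_import(lists, name, fetch):
    """lists: dict of list-name -> set of addresses; fetch: () -> set."""
    backup = name + "-backup"
    lists[backup] = lists.pop(name, set())
    try:
        lists[name] = fetch()
        del lists[backup]                    # success: old entries gone
    except Exception:
        lists[name] = lists.pop(backup)      # failure: restore old list
        raise
```

The point of the pattern is that a failed download can no longer leave you with an empty blacklist, which is exactly the failure mode reported earlier in the thread.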

New version of this script can be found in this posting: viewtopic.php?p=879181#p935938
Last edited by msatter on Sun May 29, 2022 2:06 pm, edited 11 times in total.
 
User avatar
jvanhambelgium
Forum Guru
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Sep 11, 2021 11:05 am

A big "Thank you!" towards all contributors!
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Sep 11, 2021 2:43 pm

A big "Thank you!" towards all contributors!
Especially to profinet who thought of combining the two scripts to create this "Frankenstein".
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Sep 11, 2021 2:52 pm

I'm holding back my version because I also want to manage fetch errors
(thanks to msatter for the idea of identifying the type of list from its contents)
(I never saw msatter thank me for the method of downloading a file one piece at a time)

viewtopic.php?f=2&t=178355&p=878643#p878643

This is work-in-progress code for managing fetch errors; it is ready to paste into the terminal for testing purposes.
/file remove [find where name="testfetch.txt"]
{
    :local jobid [:execute file=testfetch.txt script="/tool fetch url=http://mikrotik.com"]
    :put "Waiting the end of process for file testfetch.txt to be ready, max 20 seconds..."
    :global Gltesec 0
    :while (([:len [/sys script job find where .id=$jobid]] = 1) && ($Gltesec < 20)) do={
        :set Gltesec ($Gltesec + 1)
        :delay 1s
        :put "waiting... $Gltesec"
    }
    :put "Done. Elapsed Seconds: $Gltesec\r\n"
    :if ([:len [/file find where name="testfetch.txt"]] = 1) do={
        :local filecontent [/file get [/file find where name="testfetch.txt"] contents]
        :put "Result of Fetch:\r\n****************************\r\n$filecontent\r\n****************************"
    } else={
        :put "File not created."
    }
}

In this case we obtain at the end "closing connection: <302 Found "https://mikrotik.com/"> 159.148.147.196:80 (4)"
because "http ://mikrotik.com" redirects to "https ://mikrotik.com/" (which redirects again to "https ://www.mikrotik.com/") (spaces added on purpose)
Last edited by rextended on Fri Feb 03, 2023 7:23 pm, edited 2 times in total.
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Apr 04, 2022 5:39 am


Example:
$update url=https://project.turris.cz/greylist-data/greylist-latest.csv delimiter=, listname=turris timeout=8d heirule=dns|sip
I may have missed the answer above, but does dns|sip mean the IP is added if 'dns' or 'sip' is in the line? Or does it mean that both 'dns' and 'sip' need to be present for the IP to be added?

Overall, the 'final' version is really good. :) Thank you!
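(For reference, the heirule is applied with the `~` regex operator, where `|` is alternation, so it should match a line containing either word; a quick terminal check, with made-up sample lines:)

```
{
    :local heirule "dns|sip"
    :put ("192.0.2.1,dns" ~ $heirule);  # true - 'dns' is present
    :put ("192.0.2.1,sip" ~ $heirule);  # true - 'sip' is present
    :put ("192.0.2.1,ntp" ~ $heirule);  # false - neither is present
}
```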
 
User avatar
Joni
Member Candidate
Member Candidate
Posts: 157
Joined: Fri Mar 20, 2015 2:46 pm
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Apr 04, 2022 10:50 am

"Thank you!" The following is just intended to lower the adoption threshold and improve the script.

Some minor notes, this script:
  • doesn't have a distinguishable name, which would help in finding it; something along the lines of "Shumkov msatter Blacklister"
  • has no script versioning in # comments (for clarity, typically a release date v2021.09.11)
  • the code doesn't contain a comment with the origin/source reference URL (this thread) viewtopic.php?t=152632, so after a million copy-pastes and rip-offs you could still find the original version
  • maybe more importantly it doesn't quote the heirule legend https://project.turris.cz/greylist-data/legend.txt so you could more easily scope your blocking options:
    • amplifiers Easily exploitable services for amplification
    • broken_http Broken inbound HTTP (known services)
    • cryptocoin Cryptocoin miners
    • databases Database servers
    • dns Incoming DNS queries
    • http_scan HTTP/S scans
    • low_ports Low ports (<1024)
    • netbios NetBIOS
    • netis Netis router exploit
    • ntp NTP
    • proxy_scan Scans for HTTP/S and SOCKS proxies
    • remote_access Remote access services (RDP, VNC, etc.)
    • samba Samba (Windows shares)
    • sip SIP ports
    • ssdp SSDP
    • ssh SSH
    • synology Synology NAS
    • telnet Telnet
    • torrent Common Torrent ports
  • especially since the last release post defaults to:
    • heirule=http which strictly isn't an official heirule, but less obviously matches heirules broken_http and http_scan
    • and uses nolog=1 which hides script progress making the adoption threshold higher
    • has no comment parameter like comment=turris-http so the address list and heirule relation would be more clear
    • as a more adoptable default I suggest something along the lines of
      }; # do
      $update url=https://project.turris.cz/greylist-data/greylist-latest.csv listname=turris comment=Turris-all timeout=8d
      }
  • there are no final instructions on use:
    • add to /system scripts
    • add /system scheduler for script
    • add firewall rule to actually block traffic according to the "turris" address list this scheduled script generates
      • /ip firewall filter add action=drop chain=input comment="Shumkov msatter Blacklister" interface=eth0 log=yes log-prefix=TURRIS src-address-list=turris
  • I challenge you to explain the origin of the name "hei" rule
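Those missing usage steps could look roughly like this (script and rule names here are made up; adjust the interface, list name, and schedule to taste):

```
# save the downloader as a script (put the full downloader body in source=)
/system script add name=Blacklister source=":put \"paste the downloader script here\""

# run it once a day via the scheduler
/system scheduler add name=Blacklister-daily interval=1d \
    on-event="/system script run Blacklister"

# actually drop inbound traffic from listed sources
/ip firewall filter add chain=input action=drop \
    src-address-list=turris comment="Shumkov msatter Blacklister"
```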
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Apr 04, 2022 11:05 am

I challenge you to explain the origin of the name "hei" rule
Something from the Czech language?
 
User avatar
Joni
Member Candidate
Member Candidate
Posts: 157
Joined: Fri Mar 20, 2015 2:46 pm
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Apr 04, 2022 11:11 am

I challenge you to explain the origin of the name "hei" rule
Something from the Czech language?
I don't know.
But explain (find out), not speculate (guess) ;)
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Apr 04, 2022 12:36 pm

The reason I did not put my name in the script is because of the black cat that is sitting above in the thread.

For SIP, for example, you often have a subscription and only the SIP provider's IP address connects in. If you only allow that IP to connect to your SIP, then you won't need the SIP part of the address list. If you offer a SIP service yourself and don't know in advance where your connections will come from, then the address list helps.

Where and how you use the script is up to you; we did the work to find a good solution. It is combined knowledge from different people, and work spread over years, not only in this thread. MikroTik gives us a basic set of tools, and combining them gives a good result in the end.
The scripts we publish are between { } so they can be run from the terminal; when the result is not what you expected, press arrow-up until you reach the line to change, make your change, and then press enter.

About the heirules and how they are used: viewtopic.php?t=152632#p874324

...and the black cat is as always, also underneath.
Last edited by msatter on Mon Apr 04, 2022 1:25 pm, edited 2 times in total.
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Apr 04, 2022 1:03 pm

:) 
 
Sob
Forum Guru
Forum Guru
Posts: 9188
Joined: Mon Apr 20, 2009 9:11 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Apr 04, 2022 5:01 pm

Something from the Czech language?
Nope, Chinese:

https://project.turris.cz/greylist-data/legend.txt (see the ascii art character at the beginning)
https://dictionary.hantrainerpro.com/ch ... _black.htm
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Apr 05, 2022 10:18 pm

Does the "Import was NOT successful!" error appear if there were no changes to the list, when using the noerase= option?
[k@a] > /system script run Advanced-Downloader 
Starting import of address-list: turris
Entries not conditional deleted in address-list: turris
List identified as a IPv4 list
Reading Part: 1 0 - 63999
Reading Part: 2 63488 - 127487
Reading Part: 3 126976 - 190975
Reading Part: 4 190464 - 254463
Reading Part: 5 253952 - 317951
Import was NOT successfull! Check if the list turris is still being maintained.
Restoring backup list: turris
[k@a] > 
Also, does the "Restoring backup list" need to happen if noerase= is set?


I changed the list to
 $update url=https://lists.blocklist.de/lists/all.txt listname=BlockList-DE timeout=1d noerase=1 
And get this output
/system script run BlockList-DE
Starting import of address-list: BlockList-DE
Entries not conditional deleted in address-list: BlockList-DE
List identified as a IPv4 list
Reading Part: 1 0 - 63999
Reading Part: 2 63488 - 127487
Reading Part: 3 126976 - 190975
Reading Part: 4 190464 - 254463
Reading Part: 5 253952 - 317951
Reading Part: 6 317440 - 381439
Completed reading 1 items into address-list BlockList-DE.
The one entry that gets added is 0.0.0.0, which isn't in the list.
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Apr 07, 2022 3:05 am

It is a weekly list, so just update it at 06:30, and polling it more often will only create more load on their side.
According to the Turris support, it is updated daily.
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Apr 11, 2022 2:22 am


I changed the list to
 $update url=https://lists.blocklist.de/lists/all.txt listname=BlockList-DE timeout=1d noerase=1 
And get this output
/system script run BlockList-DE
Starting import of address-list: BlockList-DE
Entries not conditional deleted in address-list: BlockList-DE
List identified as a IPv4 list
Reading Part: 1 0 - 63999
Reading Part: 2 63488 - 127487
Reading Part: 3 126976 - 190975
Reading Part: 4 190464 - 254463
Reading Part: 5 253952 - 317951
Reading Part: 6 317440 - 381439
Completed reading 1 items into address-list BlockList-DE.
The one entry that gets added is 0.0.0.0, which isn't in the list.
Still having this issue...

If I use the original script, it imports around a quarter of it and stops at the IP addresses starting with 150 (the 64 KiB limit?).

How can I (we) figure out where/why this doesn't import?
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Apr 11, 2022 9:01 pm

It is a weekly list, so just update it at 06:30, and polling it more often will only create more load on their side.
According to the Turris support, it is updated daily.
I read this: https://project.turris.cz/en/greylist/

And look at the dates of the files: https://project.turris.cz/greylist-data/

Is Turris support then wrong on this?
------------------------------------------------------------------------------
Backup and restore: on failure it leaves you with the original list in place.

-------------------------------------------------------------------------------
I had a first look and did not find a reason why it should not be imported, so I will have a second look in time.

The second look was a success, and the following line worked for me:
$update url=https://lists.blocklist.de/lists/all.txt listname=BlockList-DE delimiter=("\n") timeout=1d noerase=1
Result:
{... $update url=https://lists.blocklist.de/lists/all.txt listname=BlockList-DE delimiter=("\n") timeout=1d noerase=1
{... }
Starting import of address-list: BlockList-DE
Entries not conditional deleted in address-list: BlockList-DE
Using config-line defined delimiter: "
                                      "
Reading Part: 1 0 - 63999
Reading Part: 2 63488 - 127487
Reading Part: 3 126976 - 190975
Reading Part: 4 190464 - 254463
Reading Part: 5 253952 - 317951
Reading Part: 6 317440 - 381439
Completed reading 23347 items into address-list BlockList-DE.
The display of the delimiter is broken by the \n, which is a newline.
I got the error message earlier and will have a look at that too.
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Apr 23, 2022 9:00 am


I read this: https://project.turris.cz/en/greylist/

And look at the dates of the files: https://project.turris.cz/greylist-data/

Is Turris then support wrong on this?
https://view.sentinel.turris.cz/greylis ... hive/2022/

Shows a new file everyday :)
$update url=https://lists.blocklist.de/lists/all.txt listname=BlockList-DE delimiter=("\n") timeout=1d noerase=1
Cool. I was using 'delimiter' on the original versions. I didn't realize it would work, or even be needed, on the newer 'advanced' version.
Still only adding the 1 item though.. 0.0.0.0
{
/ip firewall address-list
:local update do={
 :put "Starting import of address-list: $listname"
 :if ($nolog = null) do={:log warning "Starting import of address-list: $listname"}

 :local maxretry 3
 :local retrywaitingtime 120s
 :local retryflag true
 :for retry from=1 to=$maxretry step=1 do={
  :if ($retryflag) do={ :set $retryflag false; :set $counter 0
  :if ($retry > 1) do={
   :put "Source file changed. Retrying after a $retrywaitingtime wait..."
   :if ($nolog = null) do={:log warning "Source file changed. Retrying after a $retrywaitingtime wait..."}
   :delay $retrywaitingtime  }
  
 :local filesize ([/tool fetch url=$url src-address=192.0.2.1 user=$user password=$password keep-result=no as-value]->"total")
 :local start 0
 :local maxsize 64000;	        # requested chunk size
 :local end ($maxsize - 1);	# because start is zero the maxsize has to be reduced by one
 :local partnumber	 ($filesize / ($maxsize / 1024)); # how many chunks of maxsize
 :local remainder	 ($filesize % ($maxsize / 1024)); # the last partial chunk 
 :if ($remainder > 0)    do={ :set $partnumber ($partnumber + 1) }; # total number of chunks
 :if ($heirule != null) do={:put "Using as extra filtering: $heirule"} else={:set $heirule "."}
 # remove the current list completely if "erase" is not present (default setting)
  :if ($noerase = null) do={  
   :if ($timeout = null) do={:set $timeout 00:00:00; :do {:foreach i in=[/ip firewall address-list find list=$listname] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} } else={
   :do {:foreach i in=[/ip firewall address-list find list=$listname dynamic] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} };                
   :put ("Conditional deleting all".$dynamic." entries in address-list: $listname")
   :if ($nolog = null) do={:log warning ("Conditional deleting all".$dynamic." entries in address-list: $listname")}
  } else={:put "Entries not conditional deleted in address-list: $listname"}; # ENDIF ERASE
 :for x from=1 to=$partnumber step=1 do={
   # get filesize to be compared to the original one and if changed then retry
   :local comparesize ([/tool fetch url=$url src-address=192.0.2.1 user=$user password=$password keep-result=no as-value]->"total")
   
#:set $comparesize 5 

   # fetching the chunks from the webserver when the size of the source file has not changed
   # empty array when the source file changed. No processing is done till the next complete retry
   :if ($comparesize = $filesize) do={:set $data ([:tool fetch url=$url src-address=192.0.2.1 user=$user password=$password http-header-field="Range: bytes=$start-$end" output=user as-value]->"data")} else={:set $data [:toarray ""]; :set $retryflag true}
     #:if ($ownposix = null) do={
  # determining the used delimiter in the list if not provided in the config
   # this only run once and so the impact on the import time is low
    :local ipv4Posix	  "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}"
    :local ipv4rangePosix "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}/[0-9]{1,2}"
    :local domainPosix	  "^.+\\.[a-z.]{2,7}"
    :local sdata $data;     
    :while ([:len $sdata]!=0 && $delimiter = null) do={ # The check on length of $sdata is for if no delimiter is found.
       	:local sline [:pick $sdata 0 [:find $sdata "\n"]]; :local slen [:len $sline];
       	# set posix depending of type of data used in the list
       	:if ($sline ~ $ipv4Posix)	do={:set $posix $ipv4Posix;	 :set $iden "List identified as a IPv4 list"}
       	:if ($sline ~ $ipv4rangePosix)	do={:set $posix $ipv4rangePosix; :set $iden "List identified as a IPv4 with ranges list"}
       	:if ($sline ~ $domainPosix)	do={:set $posix $domainPosix;	 :set $iden "List identified as a domain list"}
       	:if ($sline ~ $posix) do={:put $iden}
      	:if ($sline ~ $posix) do={ # only explore the line if there is match at the start of the line.
	 :do {:if ([:pick $sline 0 ($slen-$send)] ~ ($posix."\$")|| $send > $slen) do={:set $delimiter [:pick $sline ($slen-$send) ($slen-($send-1))]; :set $result true} else={:set $send ($send+1);} } while (!$result);
	}; #IF posix
	:set $sdata [:pick $sdata ([:find $sdata "\n"]+1) [:len $sdata]];
	:if ($delimiter != null) do={:local sdata [:toarray ""]}; #Clear array sdata and it is not needed anymore and triggering so the While to end
    }; #WHILE END $sdata
    :local sdata [:toarray ""] 
   #} else={:put "User defind Posix: $ownposix"; :set $posix $ownposix } ; # ENDIF ownposix = null   
   :if ($posix = null && $delimiter != null) do={:set $posix "."; :put "Using config-line defined delimiter: \"$delimiter\""}; # delimiter provided by config line
   :if (!$retryflag) do={:put "Reading Part: $x $start - $end"}   
   :if ($timeout = null) do={:local timeout 00:00:00}; # if no timeout is defined make it a static entry.    
   # Only remove the first line if you are not at the start of the list
   
   :if ($start > 0) do={:set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]}
     :while ([:len $data]!=0) do={
       :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
       :if ( $line ~ $posix && $line~heirule) do={    
        :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$comment timeout=$timeout; :set $counter ($counter + 1)} on-error={}; # on error avoids any panics        
       }; # if IP address && extra filter if present
      :set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
      # Cut off the end of the chunk by removing the last lines... very dirty but it works
      :if ([:len $data] < 256) do={:set $data [:toarray ""]}    
     }; # while

  :set $start (($start-512) + $maxsize); # shifts the subsequent start back by 512  
  :set $end (($end-512) + $maxsize); # shifts the subsequent end back by 512 to keep the overlap
  }; # if retryflag
 }; #do for x
 
}; # for retry
 :if ($counter < 1) do={:set $resultline "Import was NOT successfull! Check if the list $listname is still being maintained."} else={:set $resultline "Completed reading $counter items into address-list $listname." } 
 :put $resultline
 :if ($nolog = null) do={:log warning $resultline }
 :if ($counter > 0) do={:do {/ip firewall address-list remove [find where list=("backup".$listname)]} on-error={} } else={
 :do {:foreach i in=[/ip firewall address-list find list=("backup".$listname)] do={/ip firewall address-list set list=$listname $i }} on-error={}
 :put "Restoring backup list: $listname" 
 :if ($nolog = null) do={:log warning "Restoring backup list: $listname"}
 }; # if counter restore on failure and remove on success
}; # do
$update url=https://lists.blocklist.de/lists/all.txt listname=BlockList-DE user="anonymous" password="anonymous" timeout=1d noerase=1
}

# To be used configline settings:
# url=	        https://name.of.the.list
# listname=	name of address-list

# Optional settings
# timeout=	the time the entry should be active. If omitted then static entries are created.
# comment=	puts this comment on every line in the chosen address-list (default: no comment)
# heirule=	matches a word on each line to decide whether to import it (default: no heirule)
# noerase=	any value, then the current list is not erased (default: erase)
# ownPosix=	allows entering your own regex POSIX to be used (not active at this moment)
# nolog=        any value, then don't write to the log (default: writing to log)
And the output..
system script run blocklist-de 
Starting import of address-list: BlockList-DE
Entries not conditional deleted in address-list: BlockList-DE
List identified as a IPv4 list
Reading Part: 1 0 - 63999
Reading Part: 2 63488 - 127487
Reading Part: 3 126976 - 190975
Reading Part: 4 190464 - 254463
Reading Part: 5 253952 - 317951
Completed reading 1 items into address-list BlockList-DE.
Humm.... Sometimes it does work.. Sometimes it doesn't.. Humm...
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Apr 23, 2022 12:04 pm

Last time I checked, Greylist was updated only weekly. You show a link to Sentinel, and that is updated more frequently. It looks like the Sentinel files have replaced the Greylist files now.
I did not look into the files.

This is an excellent page showing a lot of near-realtime data.

https://view.sentinel.turris.cz/?period=1w

The script could be adapted to not leave 0.0.0.0 in the list and to take the counter back down to zero. Maybe then you will get a warning.

I think the \n is not found as a delimiter because here it is two characters.

Update:
When testing my default script I got this message: failure: closing connection: <301 Moved Permanently "https:// view.sentinel.turris .cz/greylist-data/"> 217.31.192.69:443 (5), so I have adapted the URL in my script. This is because RouterOS does not follow 301 redirects.
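The two-character problem mentioned above is easy to see in the terminal: typed as an escape inside a quoted string, \n is a single newline character, but entered as a literal backslash followed by n it is two characters and will never match a line break:

```
{
    :put [:len "\n"];   # a real newline: length 1
    :put [:len "\\n"];  # literal backslash + n: length 2
}
```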
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Apr 25, 2022 6:01 am

When testing my default script I got this message: failure: closing connection: <301 Moved Permanently "https:// view.sentinel.turris .cz/greylist-data/"> 217.31.192.69:443 (5), so I have adapted the URL in my script. This is because RouterOS does not follow 301 redirects.
Yeah, that was where I found the historical data.. :)
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Apr 25, 2022 1:18 pm

So I was talking about Greylist and you were talking about Sentinel, so we were both right. ;-)
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun May 15, 2022 9:51 pm

Can the 'type' be set on the line with the URL too?
       	# set posix depending of type of data used in the list
       	:if ($sline ~ $ipv4Posix)	do={:set $posix $ipv4Posix;	 :set $iden "List identified as a IPv4 list"}
       	:if ($sline ~ $ipv4rangePosix)	do={:set $posix $ipv4rangePosix; :set $iden "List identified as a IPv4 with ranges list"}
       	:if ($sline ~ $domainPosix)	do={:set $posix $domainPosix;	 :set $iden "List identified as a domain list"}
       	:if ($sline ~ $posix) do={:put $iden}
Something like
$update url=https://lists.blocklist.de/lists/all.txt listname=BlockList-DE user="anonymous" password="anonymous" timeout=1d posix=ipv4Posix

## Because this line says "not active at the moment"
# ownPosix=	allows entering your own regex POSIX to be used (not active at this moment)

Or am I looking in the wrong place to find the variable that needs to be set?
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu May 19, 2022 1:58 pm

If I remember correctly, I first implemented ownposix, and then automatic detection superseded it. I kept ownposix in there for a reason; I think that is what you want to use it for.

Re-activate this by removing the # before :if ($own...:
     #:if ($ownposix = null) do={
  # determining the used delimiter in the list if not provided in the config
And the same here, before } else={:pu...:
    :local sdata [:toarray ""] 
   #} else={:put "User defind Posix: $ownposix"; :set $posix $ownposix } ; # ENDIF ownposix = null   

You then have to append a POSIX string to the call line: ownposix="^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}"
 
texmeshtexas
Member Candidate
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat May 28, 2022 7:42 pm

@Shumkov, @rextended and @msatter
You are all incredible. This is a fantastic workaround for the 64K file limit.
I was actually able to load this very large list:
$update url=https://iplists.firehol.org/files/block ... t_ua.ipset listname=blocklist_net_ua delimiter=("\n")
It took a bit, as the list has over 102K entries, but it was loaded within 5 minutes on my RB4011.

Amazing!!!!!
Last edited by texmeshtexas on Sun May 29, 2022 7:46 pm, edited 1 time in total.
 
texmeshtexas
Member Candidate
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat May 28, 2022 8:04 pm

I'm seeing that lists like this one: $update url=https://lists.blocklist.de/lists/all.txt listname=BlockList-DE delimiter=("\n") timeout=1d noerase=1
have some IPv6 entries.
It would be nice if the script put those entries into a corresponding IPv6 firewall address list, even with the same list name.
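A rough sketch of that idea, assuming a simplified IPv6 pattern (the `ipv6Posix` regex and the sample line below are illustrative only, far looser than a full RFC 4291 validator):

```
{
    :local line "2001:db8::1"
    # crude check: hex groups separated by colons, illustrative only
    :local ipv6Posix "^[0-9a-fA-F]{0,4}:[0-9a-fA-F:]+"
    :if ($line ~ $ipv6Posix) do={
        /ipv6 firewall address-list add list=blacklist address=$line timeout=1d
    } else={
        /ip firewall address-list add list=blacklist address=$line timeout=1d
    }
}
```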
 
Simonej
Frequent Visitor
Frequent Visitor
Posts: 60
Joined: Sun Aug 22, 2021 3:34 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun May 29, 2022 12:13 pm

Hello!
Could be a good idea to integrate TOR Exits?
https://check.torproject.org/torbulkexitlist
https://www.dan.me.uk/torlist/
https://www.dan.me.uk/torlist/?exit

I tried to integrate them with $update with no success; the addresses came out as 1,2,3,4,5,6...
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun May 29, 2022 12:57 pm

Try adding delimiter=("\n") to the $update line
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun May 29, 2022 1:48 pm

Because of the problem of "\n" having to be set manually, I have adapted the script to do this for you when no delimiter has been found:
{
/ip firewall address-list
:local update do={
 :put "Starting import of address-list: $listname"
 :if ($nolog = null) do={:log warning "Starting import of address-list: $listname"}
 
 :local displayed true
 :local maxretry 3
 :local retrywaitingtime 120s
 :local retryflag true
 :for retry from=1 to=$maxretry step=1 do={
  :if ($retryflag) do={ :set $retryflag false; :set $counter 0
  :if ($retry > 1) do={
   :put "Source file changed. Retrying after a $retrywaitingtime wait..."
   :if ($nolog = null) do={:log warning "Source file changed. Retrying after a $retrywaitingtime wait..."}
   :delay $retrywaitingtime  }
  
  :local fetchResult [/tool fetch url=$url keep-result=no as-value]
  :local filesize ($fetchResult->"total")
  :local downsize ($fetchResult->"downloaded") 
  :if ($filesize = 0 && $downsize > 0) do={ :set $filesize $downsize}

  :local start 0
  :local maxsize 64000;	        # requested chunk size
  :local end ($maxsize - 1);	# because start is zero the maxsize has to be reduced by one
  :local partnumber	 ($filesize / ($maxsize / 1024)); # how many chunks of maxsize
  :local remainder	 ($filesize % ($maxsize / 1024)); # the last partial chunk 
  :if ($remainder > 0)    do={ :set $partnumber ($partnumber + 1) }; # total number of chunks
  :if ($heirule != null) do={:put "Using as extra filtering: $heirule"} else={:set $heirule "."}
 # remove the current list completely if "erase" is not present (default setting)
  :if ($noerase = null) do={  
   :if ($timeout = null) do={:set $timeout 00:00:00; :do {:foreach i in=[/ip firewall address-list find list=$listname] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} } else={
   :do {:foreach i in=[/ip firewall address-list find list=$listname dynamic] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} };                
   :put ("Conditional deleting all".$dynamic." entries in address-list: $listname")
   :if ($nolog = null) do={:log warning ("Conditional deleting all".$dynamic." entries in address-list: $listname")}
  } else={:put "Entries not conditional deleted in address-list: $listname"}; # ENDIF ERASE
 :for x from=1 to=$partnumber step=1 do={
   # get filesize to be compared to the original one and if changed then retry
   :local comparesize ([/tool fetch url=$url keep-result=no as-value]->"total")
   :if ($comparesize = 0 && $downsize > 0) do={ :set $comparesize $downsize}
   
   # fetching the chunks from the webserver when the size of the source file has not changed
   # empty array when the source file changed. No processing is done till the next complete retry
   :if ($comparesize = $filesize) do={:set $data ([:tool fetch url=$url http-header-field="Range: bytes=$start-$end" output=user as-value]->"data")} else={:set $data [:toarray ""]; :set $retryflag true}
     #:if ($ownposix = null) do={
  # determining the used delimiter in the list, when not provided in the config
   # this only run once and so the impact on the import time is low
    :local ipv4Posix	  "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}"
    :local ipv4rangePosix "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}/[0-9]{1,2}"
    :local domainPosix	  "^.+\\.[a-z.]{2,7}"
    :local sdata $data;
   # removes any lines at the top of the file that could interfere with finding the correct posix. Setting remarksign is needed
    :while ([:pick $sdata 0 1] = $remarksign) do={ :set $sdata [:pick $sdata ([:find $sdata "\n"]+1) [:len $sdata]] }    
    :while ([:len $sdata]!=0 && $delimiter = null) do={ # The check on length of $sdata is for if no delimiter is found.   
       	:local sline [:pick $sdata 0 [:find $sdata "\n"]]; :local slen [:len $sline];
       	# set posix depending of type of data used in the list
       	:if ($sline ~ $ipv4Posix)	    do={:set $posix $ipv4Posix;	     :set $iden "List identified as a IPv4 list"}
       	:if ($sline ~ $ipv4rangePosix)	do={:set $posix $ipv4rangePosix; :set $iden "List identified as a IPv4 with ranges list"}
       	:if ($sline ~ $domainPosix)	    do={:set $posix $domainPosix;	 :set $iden "List identified as a domain list"}
       	:if ($sline ~ $posix) do={:put $iden}
      	:if ($sline ~ $posix) do={ # only explore the line if there is a match at the start of the line.
	      :do {:if ([:pick $sline 0 ($slen-$send)] ~ ($posix."\$") || $send > $slen) do={
	        :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-1))]; :set $result true} else={:set $send ($send+1)}  
             :if ($result) do={ :set  $extra [:pick $sline ($slen-$send) ($slen-($send-1))]
              :if ( $extra = " " )   do={ :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-2))] }
              :if ( $extra = "  " )  do={ :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-3))] }
              :if ( $extra = "   " ) do={ :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-4))] }
             }; # EndIf result
	      } while (!$result); # EndDoWhile
	    }; #IF sline posix
	:set $sdata [:pick $sdata ([:find $sdata "\n"]+1) [:len $sdata]]; # cut off the already searched lines
	:if ($delimiter != null) do={:set $sdata [:toarray ""]} ; # clearing $sdata also ends the WhileDo loop
    }; #WHILE END $sdata
    :set $sdata [:toarray ""]
   :if ([:len $delimiter] = 0) do={ :set $delimiter "\n"; :set $delimiterShow "New Line" } else={ :set $delimiterShow $delimiter }; # when empty use NewLine 20220529	
   #} else={:put "User defind Posix: $ownposix"; :set $posix $ownposix } ; # ENDIF ownposix = null
   :if ($delimiter != null && $displayed ) do={:set $displayed false; :put "Using config provided delimiter: \"$delimiterShow\""}
   :if ($posix = null) do={:set $posix "."}; # Use a match all posix if nothing is defined or found 
   :if (!$retryflag) do={:put "Reading Part: $x $start - $end"}   
   :if ($timeout = null) do={:local timeout 00:00:00}; # if no timeout is defined make it a static entry.    
   # Only remove the first line if you are not at the start of the list
   
:while ( [:pick $data 0 1] = $remarksign) do={ :set $data [:pick $data ([:find $data "\n"]+1) [:len $data]] }; # removes the invalid line (Spamhaus) 
   
   :if ($start > 0) do={:set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]}
     :while ([:len $data]!=0) do={
       :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
       :if ( $line ~ $posix && $line~heirule) do={    
        :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$comment timeout=$timeout; :set $counter ($counter + 1)} on-error={}; # on error avoids any panics        
       }; # if IP address && extra filter if present
      :set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
      # Cut off the end of the chunk by removing the last lines... very dirty but it works
      :if ([:len $data] < 256) do={:set $data [:toarray ""]}    
     }; # while

  :set $start (($start-512) + $maxsize); # shifts the subsequent start back by 512  
  :set $end (($end-512) + $maxsize); # shifts the subsequent end back by 512 to keep the chunks overlapping
  }; # if retryflag
 }; #do for x
 
}; # for retry
 :if ($counter < 1) do={:set $resultline "Import was NOT successful! Check if the list $listname is still being maintained."} else={:set $resultline "Completed reading $counter items into address-list $listname." } 
 :put $resultline
 :if ($nolog = null) do={:log warning $resultline }
 :if ($counter > 0) do={:do {/ip firewall address-list remove [find where list=("backup".$listname)]} on-error={} } else={
 :do {:foreach i in=[/ip firewall address-list find list=("backup".$listname)] do={/ip firewall address-list set list=$listname $i }} on-error={}
 :put "Restoring backup list: $listname" 
 :if ($nolog = null) do={:log warning "Restoring backup list: $listname"}
 }; # if counter restore on failure and remove on success
}; # do
$update url=https://www.spamhaus.org/drop/drop.txt listname=spamhaus remarksign=";" timeout=1d nolog=1
$update url=https://lists.blocklist.de/lists/all.txt listname=blockDE timeout=1d nolog=1
}

# To be used configline settings:
# url=	        https://name.of.the.list
# listname=	name of address-list

# Optional settings
# timeout=	the time the entry should be active. If omitted, static entries are created.
# comment=	puts this comment on every line in the chosen address-list (default: no comment)
# heirule=	only lines matching this word are imported (default: no heirule)
# noerase=	any value; then the current list is not erased (default: erase)
# ownPosix=	allows entering your own regex posix to be used (not active at this moment)
# nolog=        any value; then don't write to the log (default: writing to log)
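
For example, the optional settings can be combined on one config line like this (the timeout and comment values here are just illustrations, not recommendations):

$update url=https://lists.blocklist.de/lists/all.txt listname=blockDE timeout=12h comment="Fail2Ban aggregate" nolog=1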
I removed the previous versions of this script to avoid any confusion.

Updated the textual part of the script so it states when "\n" NewLine is enforced.
Last edited by msatter on Sat Jun 04, 2022 10:53 pm, edited 4 times in total.
 
Simonej
Frequent Visitor
Posts: 60
Joined: Sun Aug 22, 2021 3:34 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon May 30, 2022 1:02 am

AWESOME @msatter !
It's working with
delimiter=("\n")
seems to be mandatory also on your updated version of the script, correct?
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon May 30, 2022 1:11 am

delimiter=("\n")
seems to be mandatory also on your updated version of the script, correct?
No, just sometimes the correct delimiter isn't detected properly.

Just like sometimes there is a URL about the list as the first line of the file and the script detects it as a list of FQDNs rather than IPs. My fix for this is to download it to my own server with a cron job and delete the first couple of lines, then fetch it from my own server. I'm not advanced enough with my ROS scripting to have ROS do it, plus it keeps the scripts simpler.
 
msatter
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon May 30, 2022 8:09 am

Did you try the updated script that uses "\n" when no delimiter was detected?

If a list has a matching posix in the header, the script could pick a wrong delimiter. You can then override that by setting it manually.

How did you get on with posix yourself after reading my earlier answer to you on that?
 
msatter
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon May 30, 2022 8:15 am

AWESOME @msatter !
It's working with
delimiter=("\n")
seems to be mandatory also on your updated version of the script, correct?
I only tested it on the first link you stated, blocklist-DE. It is a very simple one: NewLine is used when the length of the found delimiter is zero. Wrongly found delimiters are never zero in length.

Update: later I also updated the textual part of the script so it states when "\n" NewLine is enforced.
 
Simonej
Frequent Visitor
Posts: 60
Joined: Sun Aug 22, 2021 3:34 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Wed Jun 01, 2022 9:53 pm

THANK YOU msatter for your AWESOME contribution!
 
Simonej
Frequent Visitor
Posts: 60
Joined: Sun Aug 22, 2021 3:34 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Jun 02, 2022 2:37 pm

Apologies for another dumb question: I looked around and found some great IP lists, but Darklist.de, Greensnow.co and Snort are returning an error. Any idea why?

Here are the ones found;
$update url=https://feodotracker.abuse.ch/downloads/ipblocklist.txt description/listname="Abuse.ch Feodo Tracker" delimiter=("\r")
$update url=https://sslbl.abuse.ch/blacklist/sslipblacklist.txt description/listname="Abuse.ch SSLBL" delimiter=("\r")
$update url=https://www.binarydefense.com/banlist.txt description/listname="Artillery Threat Intelligence Feed" delimiter=("\n")
$update url=https://lists.blocklist.de/lists/all.txt description/listname="BlockList.de - Fail2Ban" delimiter=("\n")
$update url=https://iplists.firehol.org/files/botscout.ipset description/listname="BotScout" delimiter=("\n")
$update url=https://cinsscore.com/list/ci-badguys.txt description/listname="CINS Army List" delimiter=("\n")
$update url=https://iplists.firehol.org/files/cleantalk.ipset description/listname="CleanTalk" delimiter=("\n")
$update url=https://iplists.firehol.org/files/cruzit_web_attacks.ipset description/listname="CruzIT" delimiter=("\n")
$update url=https://iplists.firehol.org/files/cybercrime.ipset description/listname="CyberCrime Tracker" delimiter=("\n")
$update url=https://iplists.firehol.org/files/darklist_de.netset description/listname="Darklist.de - Blacklisted" delimiter=("\n")
$update url=https://feeds.dshield.org/block.txt description/listname="DShield.org" delimiter=("\t") cidr=/24
$update url=https://iplists.firehol.org/files/greensnow.ipset description/listname="GreenSnow.co" delimiter=("\n")
$update url=https://myip.ms/files/blacklist/general/latest_blacklist.txt description/listname="MyIP.ms Blacklist" delimiter=("\n")
$update url=https://snort.org/downloads/ip-block-list description/listname="Snort - Talos IP Blacklist" delimiter=("\n")
$update url=https://www.spamhaus.org/drop/drop.txt description/listname="SpamHaus DROP" delimiter=("\_")
$update url=https://www.spamhaus.org/drop/edrop.txt description/listname="SpamHaus EDROP" delimiter=("\_")
$update url=https://stopforumspam.com/downloads/toxic_ip_cidr.txt description/listname="Stop Forum Spam" delimiter=("\n")
$update url=https://check.torproject.org/torbulkexitlist description/listname="Tor Exit List Service" delimiter=("\n")
$update url=https://iplists.firehol.org/files/voipbl.netset description/listname="VoIPBL.org" delimiter=("\n")
$update url=https://iplists.firehol.org/files/vxvault.ipset description/listname="VxVault" delimiter=("\n")
Last edited by Simonej on Tue Jun 07, 2022 11:41 am, edited 2 times in total.
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Jun 02, 2022 5:38 pm

Apologies for another dumb question: I looked around and found some great IP lists, but Darklist.de, Greensnow.co and Snort are returning an error. Any idea why?

Here are the ones found;
What error? You are missing some required variables.

Why are you adding so many IP lists? Especially ones that duplicate each other?
Last edited by kevinds on Thu Jun 02, 2022 6:05 pm, edited 1 time in total.
 
msatter
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Jun 02, 2022 5:43 pm

Have a look overhere: viewtopic.php?p=930372
 
Simonej
Frequent Visitor
Posts: 60
Joined: Sun Aug 22, 2021 3:34 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jun 03, 2022 1:37 pm

@kevinds just an error in the log (for Darklist.de and Greensnow.co); is there more useful information? I removed the other variables here but they are present in the code :)
Adding more lists just for learning purposes, which one is duplicated?

@msatter you had the answer for the "Snort" list; it's a redirect. Do you know an easy way to integrate the $url for a redirect?
 
msatter
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jun 03, 2022 1:46 pm

No, following redirects is something that MikroTik could implement in their code. Using JavaScript is not feasible in RouterOS.

In the meantime you could use another device to download the files; the router then downloads them from that device. That was the way I did it before having this script.
 
rextended
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jun 03, 2022 1:54 pm

I see it is needed; I'll write the script.
Give me a few dozen minutes...
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jun 03, 2022 1:58 pm

Adding more lists just for learning purposes, which one is duplicated?
Most of your blocklist.de entries. Not sure of your other lists though.
 
rextended
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jun 03, 2022 2:20 pm

*** REMOVED ***

check new version on next post
 
Simonej
Frequent Visitor
Posts: 60
Joined: Sun Aug 22, 2021 3:34 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jun 03, 2022 2:47 pm

@kevinds you're right... "all" Blocklist.de contains everything... 8)
@rextended is your solution easily implementable in msatter version? viewtopic.php?t=152632#p935938
Tried to integrate reading your post viewtopic.php?p=930372#p930458 without success.
 
rextended
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jun 03, 2022 3:11 pm

*** REMOVED ***

check new version on next post
 
rextended
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jun 03, 2022 7:29 pm

search tag # rextended checkurl

A new, more advanced version of the function for checking the URL

Checks if the URL is valid and reads the server response and redirects
:global checkurl do={
    /file remove [find where name~"checkurl.(txt|tmp)"]
    {
        :local jobid [:execute file=checkurl.txt \
            script="/tool fetch http-header-field=\"Range: bytes=0-0\" dst-path=\"checkurl.tmp\" url=\"$1\""]
        :local testsec 0
        :while (([:len [/sys script job find where .id=$jobid]] = 1) && ($testsec < 20)) do={
            :set testsec ($testsec + 1)
            :delay 1s
        }
        :local error { cod="000.0" ; txt="NO CODE"}
        :if ([:len [/file find where name="checkurl.txt"]] = 1) do={
            :local check [/file get [/file find where name="checkurl.txt"] contents]
            /file remove [find where name~"checkurl.(txt|tmp)"]
            # 200 URL OK
            :if ($check~"status: finished") do={
                :set ($error->"cod") "200"
                :set ($error->"txt") "OK"
                :return $error
            }
            # 301 Permanent Redirect
            :if ($check~" <301 Moved Permanently ") do={
                :set ($error->"cod") "301"
                :set ($error->"txt") [:pick $check ([:find $check " <301 Moved Permanently \"" -1] + 25) [:find $check "\"> " -1]]
                :return $error
            }
            # 302 Redirect
            :if ($check~" <302 Found ") do={
                :set ($error->"cod") "302"
                :set ($error->"txt") [:pick $check ([:find $check " <302 Found \"" -1] + 13) [:find $check "\"> " -1]]
                :return $error
            }
            # other Codes (error or not)
            :if ($check~" <.*> ") do={
                :set ($error->"txt") [:pick $check ([:find $check " <" -1] + 2) [:find $check "> " -1]]
                :set ($error->"cod") [:pick ($error->"txt") 0 [:find ($error->"txt") " " -1]]
                :set ($error->"txt") [:pick ($error->"txt") ([:find ($error->"txt") " " -1] + 1) [:len ($error->"txt")]]
                :return $error
            }
            # MikroTik fetch specific errors
            :if ($check~"failure: ") do={
                :set ($error->"cod") "666.1"
                :set ($error->"txt") [:pick $check ([:find $check "failure: " -1] + 9) [:len $check]]
                :return $error
            }
            # unexpected results
            :set ($error->"cod") "000"
            :set ($error->"txt") $check
            :return $error
        } else={
            # :execute unsuccessful or timeout
            :set error { cod="666.0" ; txt="TEMP FILE ERROR"}
            /file remove [find where name~"checkurl.(txt|tmp)"]
            :return $error
        }
    }
}

Syntax: call $checkurl and you get back an array with the return code and text.
# 301 permanent redirect example
:put [$checkurl "http://forum.mikrotik.com"]
cod=301;txt=https://forum.mikrotik.com/

# 302 CDN redirect example:
:put [$checkurl "https://snort.org/downloads/ip-block-list"]
cod=302;txt=https://snort-org-site.s3.amazonaws.com/.../ip_filter.blf?X-Amz-Algorithm=...

# 404 not found example
:put [$checkurl "https://forum.mikrotik.com/not-exist"]
cod=404;txt=Not Found

How to use on download lists:
{
    :local url        "https://snort.org/downloads/ip-block-list"
    :local filename   "ip_filter.blf"

    :local testresult [$checkurl $url]
    :local returncode  ($testresult->"cod")
    :local returntext  ($testresult->"txt")
    :if ($returncode = "200") do={
        # can be downloaded directly
        $update url=$url
    } else={
        :if ($returncode = "302") do={
            # use redirected URL
            $update url=$returntext
        } else={
            # some error happen
            :log error "Error checking $url: $returncode $returntext"
        }
    }
}
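
The usage sketch above only handles codes 200 and 302; since $checkurl returns the new location in "txt" for both 301 and 302, a 301 can presumably be followed the same way:

:if ($returncode = "301" || $returncode = "302") do={
    # follow the redirect target returned by $checkurl
    $update url=$returntext
}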
Last edited by rextended on Fri Feb 03, 2023 7:23 pm, edited 2 times in total.
 
msatter
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jun 03, 2022 9:00 pm

To integrate checking for a redirect I tested it quick and dirty with these adaptations:

The import script changes:
{
/ip firewall address-list
:local update do={
:global checkurl
:if ($url ~ "invalid URL protocol") do={:log error "Could not import $listname due to a problem with the stated URL"} else={

 :put "Starting import of address-list: $listname"
 .
 .
 .
 #$update url=https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv delimiter=, listname=turris timeout=8d heirule=http nolog=1
$update url=[$checkurl https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv] delimiter=, listname=turris timeout=8d heirule=http nolog=1
 }
And slimmed down the global function so it returns only one result:
:global checkurl do={
    /file remove [find where name~"checkurl.(txt|tmp)"]
    {
        :local jobid [:execute file=checkurl.txt \
            script="/tool fetch http-header-field=\"Range: bytes=0-0\" dst-path=\"checkurl.tmp\" url=\"$1\""]
        :local testsec 0
        :while (([:len [/sys script job find where .id=$jobid]] = 1) && ($testsec < 20)) do={
            :set testsec ($testsec + 1)
            :delay 1s
        }
        :local error { cod="000.0" ; txt="NO CODE"}
        :if ([:len [/file find where name="checkurl.txt"]] = 1) do={
            :local check [/file get [/file find where name="checkurl.txt"] contents]
            /file remove [find where name~"checkurl.(txt|tmp)"]
            # 200 URL OK
            :if ($check~"status: finished") do={
                :set $error $1
                :return $error
            }
            # 301 Permanent Redirect
            :if ($check~" <301 Moved Permanently ") do={

                :set $error [:pick $check ([:find $check " <301 Moved Permanently \"" -1] + 25) [:find $check "\"> " -1]]
                :return $error
            }
            # 302 Redirect
            :if ($check~" <302 Found ") do={

                :set $error [:pick $check ([:find $check " <302 Found \"" -1] + 13) [:find $check "\"> " -1]]
                :return $error
            }
            # other Codes (error or not)
            :if ($check~" <.*> ") do={
                :set ($error->"txt") [:pick $check ([:find $check " <" -1] + 2) [:find $check "> " -1]]
                :set ($error->"cod") [:pick ($error->"txt") 0 [:find ($error->"txt") " " -1]]
                :set ($error->"txt") [:pick ($error->"txt") ([:find ($error->"txt") " " -1] + 1) [:len ($error->"txt")]]
                :return $error
            }
            # MikroTik fetch specific errors
            :if ($check~"failure: ") do={
  
                :set $error [:pick $check ([:find $check "failure: " -1] + 9) [:len $check]]
                :return $error
            }
            # unexpected results

            :set $error $check
            :return $error
        } else={
            # :execute unsuccessful or timeout
            :set error { cod="666.0" ; txt="TEMP FILE ERROR"}
            /file remove [find where name~"checkurl.(txt|tmp)"]
            :return $error
        }
    }
}
This can be streamlined by storing the exact error in a global variable that is read in the import script and then used in the log.

Thanks to rextended for the coding.
 
rextended
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jun 03, 2022 9:18 pm

Grazie :-o
 
texmeshtexas
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Jun 04, 2022 1:07 am

any idea why the script does not like
$update url=http://www.spamhaus.org/drop/drop.txt
Tried delimiter " ;" and ";" and variations to include "\n"
 
jvanhambelgium
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Jun 04, 2022 8:44 am

any idea why the script does not like
$update url=http://www.spamhaus.org/drop/drop.txt
Tried delimiter " ;" and ";" and variations to include "\n"
When I press that link I get : Could not connect: (and no further output)
So I could understand why the script does not like such response.


jvanham@cruncher:~$ wget http://www.spamhaus.org/drop/drop.txt
--2022-06-04 07:43:28-- http://www.spamhaus.org/drop/drop.txt
Resolving www.spamhaus.org (www.spamhaus.org)... 104.16.198.238, 104.16.199.238
Connecting to www.spamhaus.org (www.spamhaus.org)|104.16.198.238|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 19 [text/html]
Saving to: ‘drop.txt’

drop.txt 100%[==========================================================>] 19 --.-KB/s in 0s

2022-06-04 07:43:28 (2,01 MB/s) - ‘drop.txt’ saved [19/19]

jvanham@cruncher:~$ more drop.txt
Could not connect:
jvanham@Cruncher:~$
 
msatter
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Jun 04, 2022 12:26 pm

I see that total: is zero, and then nothing is imported by the script. The download shows 25KiB, so something must be going wrong. I am currently on RouterOS 7.2RC6
  status: connecting
      status: finished
  downloaded: 25KiB
       total: 0KiB
    duration: 1s
The Could not connect: on a direct download is caused by the redirect to HTTPS. Handling this will be possible in the next version of this script, as you can read above.

When in doubt, test the URL in a browser and check whether content is being shown.

First success with edited script:
{... $update url=https://www.spamhaus.org/drop/drop.txt delimiter=" ;" listname=spamhaus timeout=1d nolog=1

Starting import of address-list: spamhaus
Conditional deleting all entries in address-list: spamhaus
Using config-line defined delimiter: " ;"
Reading Part: 1 0 - 63999
Completed reading 931 items into address-list spamhaus.
Using " ;" as delimiter: a space and then a ;

Posted the updated script here: viewtopic.php?p=935938#p935938

You will have to add a space before the semicolon and enclose it in quotes... and don't forget to change http to https here.

Updated script: now up to three spaces before the delimiter are detected. Also introduced a new variable named remarksign. This is needed for lists that use the same character for the delimiter and for the start of a comment line.
To find the correct posix, the lines which start with the remark/delimiter character have to be removed from the data before the correct one can be found.

Another update: cleaned and optimized the script. It also avoids downloading the same file twice.
 
texmeshtexas
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Jun 05, 2022 6:55 pm

that did the trick. Thanks for the adjustment @msatter !!
 
Simonej
Frequent Visitor
Posts: 60
Joined: Sun Aug 22, 2021 3:34 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Jun 06, 2022 1:34 pm

Apologies for asking again; I have been trying to integrate your suggestions for the redirect, without success.
In the terminal, with :put ([$checkurl "https://snort.org/downloads/ip-block-list"]) I was able to read the correct url; any hint on how to easily integrate it with $update url= ?
Probably not useful, but testing :put ([:tool fetch url=[$checkurl "https://snort.org/downloads/ip-block-list"] output=user as-value]->"data") I got all the data in the terminal.
Thanks
 
msatter
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Jun 06, 2022 2:44 pm

You missed adding :global checkurl; you have to declare a global variable before you can use it in a script.
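
A minimal sketch of what that looks like at the top of the import script (assuming the $checkurl function posted earlier in this thread has already been run once, so it exists as a global):

# make the global function visible inside this script
:global checkurl
# now the function can be called
:put [$checkurl "https://snort.org/downloads/ip-block-list"]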
 
texmeshtexas
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Jun 07, 2022 5:11 am

after putting the :global checkurl code at the top of the main script, I use this to download everything

:global testurl [$checkurl url=https://snort.org/downloads/ip-block-list]; $update url=[:pick $testurl 1] listname="Snort"

seems to work just fine.
 
texmeshtexas
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Jun 07, 2022 5:29 am

Tried to download greensnow

$checkurl url=https://blocklist.greensnow.co/greensnow.txt
cod=666.1;txt=invalid URL protocol

it's not a redirect, and it also fails the basic download via $update
failure: closing connection: <error processing HTTP response> 85.236.154.77:443 (4)

loads on a browser ok.
Any ideas?
 
rextended
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Jun 07, 2022 11:23 am

url= ??? is not on checkurl parameters
 
Simonej
Frequent Visitor
Posts: 60
Joined: Sun Aug 22, 2021 3:34 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Jun 07, 2022 11:42 am

Can't help, still fighting with [$checkurl url=https://... but I also had the same error with Greensnow; you can find my updated list viewtopic.php?p=936764#p936764
Last edited by Simonej on Tue Jun 07, 2022 12:02 pm, edited 2 times in total.
 
rextended
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Jun 07, 2022 11:52 am

url= ??? is not on checkurl parameters
 
msatter
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Jun 07, 2022 11:55 am

My posting states:
$update url=[$checkurl https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv] delimiter=, listname=turris timeout=8d heirule=http nolog=1
So it is $update url=[$checkurl htt... and not $update [$checkurl url=htt..

Update: first test with using the function checkurl:
{... $update url=https://snort.org/downloads/ip-block-list listname=snort timeout=1h
{... $update url=https://project.turris.cz/greylist-data/greylist-latest.csv listname=turris delimiter=, heirrule=http timeout=8d
{... $update url=https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv listname=turris delimiter=, heirrule=http timeout=8d
{... $update url=htt ps://lists.blocklist.de/lists/all.txt listname=blockDE timeout=1h nolog=1                                         
{... $update url=http://www.spamhaus.org/drop/drop.txt listname=spamhaus delimiter=";" timeout=1h nolog=1
{... }
There was a problem downloading snort and the list has been ignored!

There was a problem downloading turris and the list has been ignored!

Starting import of address-list: turris
Conditional deleting all entries in address-list: turris
List identified as a IPv4 list
Using delimiter: ","
Reading Part: 1 0 - 63999
Reading Part: 2 63488 - 127487
Reading Part: 3 126976 - 190975
Completed reading 5659 items into address-list turris.

There was a problem downloading blockDE and the list has been ignored!

Starting import of address-list: spamhaus
Conditional deleting all entries in address-list: spamhaus
List identified as a IPv4 with ranges list
Using delimiter: " ;"
Reading Part: 1 0 - 63999
Completed reading 931 items into address-list spamhaus. 
## Same config but with checkurl function active ##
Starting import of address-list: snort
Conditional deleting all entries in address-list: snort
List identified as a IPv4 list
Using delimiter: "New Line"
Reading Part: 1 0 - 63999
Completed reading 783 items into address-list snort.

Checking URL...Problem (code): 301 - https://view.sentinel.turris.cz/greylist-data/
address-list turris is not imported. Check log more information

Starting import of address-list: turris
Conditional deleting all entries in address-list: turris
List identified as a IPv4 list
Using delimiter: ","
Reading Part: 1 0 - 63999
Reading Part: 2 63488 - 127487
Reading Part: 3 126976 - 190975
Completed reading 5659 items into address-list turris.

Checking URL...Problem (code): 666.1 - invalid URL protocol

address-list blockDE is not imported. Check log more information

Starting import of address-list: spamhaus
Conditional deleting all entries in address-list: spamhaus
List identified as a IPv4 with ranges list
Using delimiter: " ;"
Reading Part: 1 0 - 63999
Completed reading 931 items into address-list spamhaus.
 
msatter
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Jun 07, 2022 3:22 pm

This is a testing version of the list downloader with support for the :global $checkurl function. The function is placed underneath the script and has to be executed once so it gets stored as a global. The script also runs without $checkurl. I have adapted the function so that it does not use the file system when there is a straight download (code 200):
{removed
}
checkurl with no usage of filesystem when it is a straight download
{removed
}
I also looked at Greensnow but could not find why RouterOS has trouble with it. As soon as I use output=user it gives an error.
 
texmeshtexas
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jun 10, 2022 5:43 am

In addition to importing IP lists, I'm looking for a way to import a domain list, like
https://threatview.io/Downloads/DOMAIN- ... e-Feed.txt

I've had pretty good luck using the domain in this type of filtering scheme
In this case, I block 0000.com.my

/ip firewall raw
remove [find comment="Malicious_domain"]
add action=drop chain=prerouting protocol=tcp dst-port=80,443 content=0000.com.my tls-host=*0000.com.my place-before=0 comment="Malicious_domain"

anyone seen a script to do this sort of import?
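
A rough, untested sketch of such an import, reusing the fetch-and-parse approach of the address-list script in this thread (the $domainurl value is a placeholder, a plain one-domain-per-line list is assumed, and the 64KiB output=user limit applies here too):

:local domainurl "https://example.com/domain-feed.txt"
:local data ([/tool fetch url=$domainurl output=user as-value]->"data")
/ip firewall raw remove [find comment="Malicious_domain"]
:while ([:len $data] != 0) do={
  :local line [:pick $data 0 [:find $data "\n"]]
  # skip empty and comment lines before creating a raw drop rule
  :if ([:len $line] > 0 && [:pick $line 0 1] != "#") do={
    :do {/ip firewall raw add action=drop chain=prerouting protocol=tcp dst-port=80,443 content=$line tls-host=("*".$line) comment="Malicious_domain"} on-error={}
  }
  :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
}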
Last edited by texmeshtexas on Thu Jun 16, 2022 12:24 am, edited 2 times in total.
 
Simonej
Frequent Visitor
Posts: 60
Joined: Sun Aug 22, 2021 3:34 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jun 10, 2022 12:50 pm

Hello @msatter, still trying to understand your and @rextended's suggestions for $checkurl (already tried several times without success, I feel so dumb...), may I ask why you removed the content?
Thank you
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jun 10, 2022 1:21 pm

@Simonej: I explained the misunderstanding here: viewtopic.php?p=938735#p937764 You need to check the sequencing of the items in that line.

As for the removed content being shown: the cause is what is currently going on in the forum, the reaction from MikroTik and/or the moderators, and MikroTik's communication with its users. I am taking a personal cool-down period and thinking about whether this is still a healthy situation.
 
Simonej
Frequent Visitor
Frequent Visitor
Posts: 60
Joined: Sun Aug 22, 2021 3:34 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jun 10, 2022 2:16 pm

A huge THANKS from all the readers; your contribution is precious.
Wish you all the best.

PS: I always used $update url=[$checkurl as indicated, not other ways.
 
texmeshtexas
Member Candidate
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Wed Jun 15, 2022 1:59 am

I also looked at GreenSnow but could not find out why RouterOS has trouble with it. As soon as I use output=user, it gives an error.
GreenSnow is fully included in FireHOL level2 anyway:

$update url=https://iplists.firehol.org/files/firehol_level2.netset listname=firehol_level2 delimiter=("\n") timeout=90d
 
DarkNate
Forum Guru
Forum Guru
Posts: 1065
Joined: Fri Jun 26, 2020 4:37 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Wed Jun 15, 2022 11:53 pm

So which version of the script here is most effective and doesn't impact disk read/write? Also, what about IPv6?
 
texmeshtexas
Member Candidate
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Jun 16, 2022 8:07 am

Anyone know why this list fails?
$update url=https://feodotracker.abuse.ch/downloads ... mended.txt listname=FeodoC2 delimiter=("\n") timeout=90d
 
texmeshtexas
Member Candidate
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Jun 16, 2022 9:08 am

I'm guessing it's because some sites are encoded with gzip. That is the case with that Feodo site.
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Jun 16, 2022 11:11 am

Do not guess; it fails because the delimiter is \r\n, not only \n.
Open the file with a binary editor to see that.

The correct syntax is
delimiter="\r\n"
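To make the effect concrete, here is a minimal Python sketch (hypothetical sample data, not the RouterOS script) showing why splitting CRLF-terminated data on "\n" alone breaks the addresses:

```python
# Hypothetical CRLF-terminated list data, as served by sources like the Feodo tracker.
data = "192.0.2.1\r\n198.51.100.2\r\n"

# Splitting on "\n" alone leaves a stray "\r" at the end of every entry...
broken = data.split("\n")[0]

# ...while splitting on the full "\r\n" delimiter yields a clean address.
clean = data.split("\r\n")[0]

print(repr(broken))  # '192.0.2.1\r'
print(repr(clean))   # '192.0.2.1'
```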
 
texmeshtexas
Member Candidate
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jun 17, 2022 11:46 pm

Do not guess; it fails because the delimiter is \r\n, not only \n.
Open the file with a binary editor to see that.

The correct syntax is
delimiter="\r\n"
Sure enough, works with \r\n

thanks.
 
eXtremer
Frequent Visitor
Frequent Visitor
Posts: 95
Joined: Fri Nov 26, 2010 10:33 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Oct 07, 2022 10:00 am

Can somebody help me with DShield? It doesn't work as it's supposed to: it adds the "Start" address, and the IPs are without /24.

Thank you.
$update url=https://www.dshield.org/block.txt listname=DShield delimiter=("\t") cidr=/24 timeout=1d nolog=1
[screenshot]
 
texmeshtexas
Member Candidate
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Oct 07, 2022 4:44 pm

That list is not in CIDR format and will not work with this download script.

FireHOL also has a copy of DShield in their format;
see if that works:
https://iplists.firehol.org/files/dshield.netset

I tried this but it only downloads the first 3 entries for some reason.
$update url=https://iplists.firehol.org/files/dshield.netset listname=dshield delimiter=("\n") timeout=90d
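As background: DShield's block.txt is tab-separated (start address, end address, prefix length, ...), which is why the OP's script appends cidr=/24 instead of reading a CIDR column. A minimal Python sketch of that conversion, using made-up sample lines in the same layout:

```python
# Hypothetical sample in the DShield block.txt layout: start <TAB> end <TAB> prefix ...
sample = "192.0.2.0\t192.0.2.255\t24\tExample-Net\n203.0.113.0\t203.0.113.255\t24\tTest-Net\n"

def to_cidr(text):
    """Turn 'start<TAB>end<TAB>prefix...' lines into 'start/prefix' CIDR strings."""
    out = []
    for line in text.splitlines():
        if not line or line.startswith("#"):
            continue  # skip comment/header lines
        fields = line.split("\t")
        out.append(fields[0] + "/" + fields[2])
    return out

print(to_cidr(sample))  # ['192.0.2.0/24', '203.0.113.0/24']
```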
 
eXtremer
Frequent Visitor
Frequent Visitor
Posts: 95
Joined: Fri Nov 26, 2010 10:33 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Oct 10, 2022 9:35 am

moderator note: do not quote preceding post, use "Post Reply"
Exactly, the same thing in my case: only 3 entries, without any error. Is there a solution?
 
texmeshtexas
Member Candidate
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Oct 10, 2022 4:46 pm

Good question for @msatter or @rextended.
 
DarkNate
Forum Guru
Forum Guru
Posts: 1065
Joined: Fri Jun 26, 2020 4:37 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Oct 29, 2022 2:03 am

Does anyone know how to make the OP's script work for IPv6 address lists as well?
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Oct 29, 2022 11:27 am

Just replace ip with ipv6, and a valid regex must be created to identify IPv6 addresses, replacing the IPv4 regex.
The regex must cover all cases, for example ::, ffff::, xxx::192.168.0.1, xxxx:xxxx::xx:x:1, xxx:xx:xxxx::x:x:x/xxx, etc.
 
DarkNate
Forum Guru
Forum Guru
Posts: 1065
Joined: Fri Jun 26, 2020 4:37 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Oct 30, 2022 12:47 pm

Just replace ip with ipv6, and a valid regex must be created to identify IPv6 addresses, replacing the IPv4 regex.
The regex must cover all cases, for example ::, ffff::, xxx::192.168.0.1, xxxx:xxxx::xx:x:1, xxx:xx:xxxx::x:x:x/xxx, etc.
I got the regex, but not sure how to fit it in the script.
(([0-9a-fA-F]{1,4}:){7,7}[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,7}:|([0-9a-fA-F]{1,4}:){1,6}:[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,5}(:[0-9a-fA-F]{1,4}){1,2}|([0-9a-fA-F]{1,4}:){1,4}(:[0-9a-fA-F]{1,4}){1,3}|([0-9a-fA-F]{1,4}:){1,3}(:[0-9a-fA-F]{1,4}){1,4}|([0-9a-fA-F]{1,4}:){1,2}(:[0-9a-fA-F]{1,4}){1,5}|[0-9a-fA-F]{1,4}:((:[0-9a-fA-F]{1,4}){1,6})|:((:[0-9a-fA-F]{1,4}){1,7}|:)|fe80:(:[0-9a-fA-F]{0,4}){0,4}%[0-9a-zA-Z]{1,}|::(ffff(:0{1,4}){0,1}:){0,1}((25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9])\.){3,3}(25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9])|([0-9a-fA-F]{1,4}:){1,4}:((25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9])\.){3,3}(25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9]))
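Before wiring such a long pattern into the script, it can help to sanity-check list lines outside RouterOS. This Python sketch uses the standard ipaddress module as a simpler validity test (an illustration only, not the forum regex):

```python
import ipaddress

def is_ipv6_entry(line):
    """Return True if a list line starts with a bare IPv6 address or an IPv6 CIDR prefix."""
    token = line.strip().split()[0] if line.strip() else ""
    try:
        if "/" in token:
            ipaddress.IPv6Network(token, strict=False)  # accepts e.g. 2001:db8::/32
        else:
            ipaddress.IPv6Address(token)                # accepts e.g. ::1
        return True
    except ValueError:
        return False

print(is_ipv6_entry("2001:db8::/32"))  # True
print(is_ipv6_entry("::1"))            # True
print(is_ipv6_entry("192.0.2.1"))      # False
print(is_ipv6_entry("; comment"))      # False
```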
 
User avatar
diamuxin
Member
Member
Posts: 340
Joined: Thu Sep 09, 2021 5:46 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Nov 10, 2022 6:01 pm

Hello,

I am trying to import the following list and it is impossible; where could the problem come from?
$update url=http://test.sytes.net/BlacklistIPSEC.txt description="IPSEC" delimiter=("\n")
Any help?
BR.

{
/ip firewall address-list
:local update do={
    :do {
    :local data ([:tool fetch url=$url output=user as-value]->"data")
    :local array [find dynamic list=blacklist]
    :foreach value in=$array do={:set array (array,[get $value address])}
    :while ([:len $data]!=0) do={
        :if ([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}") do={
            :local ip ([:pick $data 0 [:find $data $delimiter]].$cidr)
            :do {add list=blacklist address=$ip comment=$description timeout=1d nolog=1} on-error={
                :do {set ($array->([:find $array $ip]-[:len $array]/2)) timeout=1d} on-error={}
                }
        }
        :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
        }
    :log info "Address list <$description> successfully updated"
    } on-error={:log warning "Address list <$description> update failed"}
}
$update url=http://test.sytes.net/BlacklistIPSEC.txt description="IPSEC" delimiter=("\n")
}
Last edited by diamuxin on Wed Mar 20, 2024 3:36 pm, edited 2 times in total.
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Nov 10, 2022 6:24 pm

It is not \n but \r for that source.
Open the file with a binary editor and you can see the line ending as "0x0D 0x0A" (\r\n) [Carriage Return / Line Feed].
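The same inspection can be done programmatically. A small Python sketch (the sample bytes are made up) that reports which line terminator a downloaded blob actually uses:

```python
def line_ending(raw: bytes) -> str:
    """Detect the dominant line terminator in a blob of list data."""
    if b"\r\n" in raw:
        return "\\r\\n (CRLF, 0x0D 0x0A)"
    if b"\r" in raw:
        return "\\r (CR, 0x0D)"
    if b"\n" in raw:
        return "\\n (LF, 0x0A)"
    return "none found"

blob = b"192.0.2.1\r\n192.0.2.2\r\n"  # hypothetical fetched data
print(line_ending(blob))  # \r\n (CRLF, 0x0D 0x0A)
```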
 
User avatar
diamuxin
Member
Member
Posts: 340
Joined: Thu Sep 09, 2021 5:46 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Nov 10, 2022 7:42 pm

It is not \n but \r for that source.
Open the file with a binary editor and you can see the line ending as "0x0D 0x0A" (\r\n) [Carriage Return / Line Feed].
Ok, I understood you.

I have installed the Hex Editor plugin in Visual Studio Code and I have seen the hexadecimal end-of-line bytes in the file; you are right (a new thing I have learned, thanks).

But the problem is that I have tried the delimiter as ("\r"), ("\n"), and ("\r\n"), and it doesn't import the address-list. I don't understand where the problem is.
$update url=http://gines.sytes.net/BlacklistIPSEC.txt description="IPSEC" delimiter=("\r")
BR.
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Nov 10, 2022 11:14 pm

Try without defining a delimiter; simply omit it.
}
        :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
        :log info "Address list <$description> successfully updated"
        }
 
User avatar
diamuxin
Member
Member
Posts: 340
Joined: Thu Sep 09, 2021 5:46 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Nov 10, 2022 11:55 pm

Try without defining a delimiter. So omitting it.
Thanks for replying, but it doesn't work.
Like this?
{
/ip firewall address-list
:local update do={
    :do {
        :local data ([:tool fetch url=$url output=user as-value]->"data")
        remove [find list=blacklist comment=$description]
        :while ([:len $data]!=0) do={
            :if ([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}") do={
                :do {add list=blacklist address=([:pick $data 0 [:find $data $delimiter]].$cidr) comment=$description timeout=1d} on-error={}
            }
            :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
        }
        :log info "Address list <$description> successfully updated"
    } on-error={:log warning "Address list <$description> update failed"}
}
$update url=http://test.sytes.net/BlacklistIPSEC.txt description=IPSec 
}
BR.
Last edited by diamuxin on Wed Mar 20, 2024 3:39 pm, edited 1 time in total.
 
User avatar
diamuxin
Member
Member
Posts: 340
Joined: Thu Sep 09, 2021 5:46 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Nov 11, 2022 1:03 am

I have found that this way it works OK:
$update url=http://192.168.88.225/BlacklistIPSEC.txt description=IPSec delimiter=("\r")
But with a NO-IP type address it does not work:
$update url=http://test.sytes.net/BlacklistIPSEC.txt description=IPSec delimiter=("\r")
I will continue to investigate...
Thanks to all.
Last edited by diamuxin on Wed Mar 20, 2024 4:25 pm, edited 1 time in total.
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Nov 11, 2022 3:23 pm

Check if the device resolves the DNS name...
:put [:resolve "gines.sytes.net"]
 
User avatar
diamuxin
Member
Member
Posts: 340
Joined: Thu Sep 09, 2021 5:46 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Nov 11, 2022 4:29 pm

check if the device solve the DNS....
Yes, it resolves the IP:
:put [:resolve "gines.sytes.net"]
79.116.10.76
Edit:

I have tested it on a free hosting service, and there it works normally. The problem seems to be with DDNS-type domains.
:put ([:tool fetch url=http://gidh.atwebpages.com/BlacklistIPSEC.txt output=user as-value]->"data")

Thx & BR.
Last edited by diamuxin on Fri Nov 11, 2022 4:33 pm, edited 1 time in total.
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Nov 11, 2022 4:32 pm

Check the permissions if it is executed from a script or the scheduler...
 
User avatar
diamuxin
Member
Member
Posts: 340
Joined: Thu Sep 09, 2021 5:46 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Nov 11, 2022 4:39 pm

Check the permissions if it is executed from a script or the scheduler...
[screenshot]

BR.
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Nov 11, 2022 4:44 pm

Sorry, but I can't come up with any guess as to why it doesn't work with DNS instead of an IP...
 
User avatar
diamuxin
Member
Member
Posts: 340
Joined: Thu Sep 09, 2021 5:46 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Nov 11, 2022 4:57 pm

Sorry, but I can't imagine any guess as to why it doesn't work with DNS instead of IP ...
No problem, thanks anyway..

BR.
 
DarkNate
Forum Guru
Forum Guru
Posts: 1065
Joined: Fri Jun 26, 2020 4:37 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Nov 19, 2022 1:43 am

I got the regex, but not sure how to fit it in the script.
(([0-9a-fA-F]{1,4}:){7,7}[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,7}:|([0-9a-fA-F]{1,4}:){1,6}:[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,5}(:[0-9a-fA-F]{1,4}){1,2}|([0-9a-fA-F]{1,4}:){1,4}(:[0-9a-fA-F]{1,4}){1,3}|([0-9a-fA-F]{1,4}:){1,3}(:[0-9a-fA-F]{1,4}){1,4}|([0-9a-fA-F]{1,4}:){1,2}(:[0-9a-fA-F]{1,4}){1,5}|[0-9a-fA-F]{1,4}:((:[0-9a-fA-F]{1,4}){1,6})|:((:[0-9a-fA-F]{1,4}){1,7}|:)|fe80:(:[0-9a-fA-F]{0,4}){0,4}%[0-9a-zA-Z]{1,}|::(ffff(:0{1,4}){0,1}:){0,1}((25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9])\.){3,3}(25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9])|([0-9a-fA-F]{1,4}:){1,4}:((25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9])\.){3,3}(25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9]))
@rextended any ideas on how to get this working for IPv6?
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Nov 19, 2022 2:41 am

Since it's not my script (it just uses my method to download the file in multiple parts), I haven't checked everything else.
Let's say it might work for IPv6 if you change these things:

from
:local update do={
to
:local updatev6 do={

replace all 8 occurrences of
/ip
with
/ipv6

replace all 8 occurrences of
ipv4
with
ipv6

replace the posix
^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}
with what you found

ignore the other posix patterns, or fix yours to also include /xxx for IPv6


But I cannot check whether it works or not.

Sooner or later I must write a correct posix for IPv6 like the one I already did for IPv4:
viewtopic.php?p=938735#p871284
 
DarkNate
Forum Guru
Forum Guru
Posts: 1065
Joined: Fri Jun 26, 2020 4:37 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Nov 19, 2022 2:45 pm

Since it's not my script (it just uses my method to download the file in multiple parts), I haven't checked everything else.
Let's say it might work for IPv6 if you change these things:

from
:local update do={
to
:local updatev6 do={

replace all 8 occurrences of
/ip
with
/ipv6

replace all 8 occurrences of
ipv4
with
ipv6

replace the posix
^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}
with what you found

ignore the other posix patterns, or fix yours to also include /xxx for IPv6


But I cannot check whether it works or not.

Sooner or later I must write a correct posix for IPv6 like the one I already did for IPv4:
viewtopic.php?p=938735#p871284
This isn't working, nor is it outputting anything:
:delay 60
/ipv6 firewall address-list
:local update do={
:do {
:local data ([:tool fetch url=$url output=user as-value]->"data")
:local array [find dynamic list=blacklist]
:foreach value in=$array do={:set array (array,[get $value address])}
:while ([:len $data]!=0) do={
:if ([:pick $data 0 [:find $data "\n"]]~"(([0-9a-fA-F]{1,4}:){7,7}[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,7}:|([0-9a-fA-F]{1,4}:){1,6}:[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,5}(:[0-9a-fA-F]{1,4}){1,2}|([0-9a-fA-F]{1,4}:){1,4}(:[0-9a-fA-F]{1,4}){1,3}|([0-9a-fA-F]{1,4}:){1,3}(:[0-9a-fA-F]{1,4}){1,4}|([0-9a-fA-F]{1,4}:){1,2}(:[0-9a-fA-F]{1,4}){1,5}|[0-9a-fA-F]{1,4}:((:[0-9a-fA-F]{1,4}){1,6})|:((:[0-9a-fA-F]{1,4}){1,7}|:)|fe80:(:[0-9a-fA-F]{0,4}){0,4}%[0-9a-zA-Z]{1,}|::(ffff(:0{1,4}){0,1}:){0,1}((25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9])\.){3,3}(25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9])|([0-9a-fA-F]{1,4}:){1,4}:((25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9])\.){3,3}(25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9]))") do={
:local ipv6 ([:pick $data 0 [:find $data $delimiter]].$cidr)
:do {add list=blacklist address=$ipv6 comment=$description timeout=1d} on-error={
:do {set ($array->([:find $array $ipv6]-[:len $array]/2)) timeout=1d} on-error={}
}
}
:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
}
} on-error={:log warning "Address list <$description> update failed"}
}
$update url=https://www.dshield.org/block.txt description=DShield delimiter=("\t") cidr=/24
$update url=https://www.spamhaus.org/drop/drop.txt description="Spamhaus DROP" delimiter=("\_")
$update url=https://www.spamhaus.org/drop/edrop.txt description="Spamhaus EDROP" delimiter=("\_")
$update url=https://sslbl.abuse.ch/blacklist/sslipblacklist.txt description="Abuse.ch SSLBL" delimiter=("\r")
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Nov 19, 2022 8:54 pm

Didn't you get a little suspicious that we were talking about two different scripts?
 
User avatar
diamuxin
Member
Member
Posts: 340
Joined: Thu Sep 09, 2021 5:46 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Dec 13, 2022 8:04 pm

Hello,

Something has happened with the DShield list so that the script no longer works; where could the error be?
:delay 5s
/ip firewall address-list
:local update do={
    :do {
    :local data ([:tool fetch url=$url output=user as-value]->"data")
    :local array [find dynamic list=blacklist]
    :foreach value in=$array do={:set array (array,[get $value address])}
    :while ([:len $data]!=0) do={
        :if ([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}") do={
            :local ip ([:pick $data 0 [:find $data $delimiter]].$cidr)
            :do {add list=blacklist address=$ip comment=$description timeout=1d} on-error={
                :do {set ($array->([:find $array $ip]-[:len $array]/2)) timeout=1d} on-error={}
                }
        }
        :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
        }
    :log info "Address list <$description> successfully updated"
    } on-error={:log warning "Address list <$description> update failed"}
}

$update url=http://feeds.dshield.org/block.txt description=DShield delimiter=("\t") cidr=/24
BR.
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Dec 13, 2022 8:44 pm

It now accepts only HTTPS, and fetch does not support the auto-redirect from HTTP to HTTPS; simply add the "s" after "http".
 
User avatar
diamuxin
Member
Member
Posts: 340
Joined: Thu Sep 09, 2021 5:46 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Dec 13, 2022 8:57 pm

Perfect! thanks.

BR.
 
vota
just joined
Posts: 1
Joined: Mon Apr 17, 2023 7:14 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Apr 17, 2023 7:21 pm

Due to the problem of "\n" having to be set manually, I have adapted the script to do this for you when no delimiter has been found:
{
/ip firewall address-list
:local update do={
 :put "Starting import of address-list: $listname"
 :if ($nolog = null) do={:log warning "Starting import of address-list: $listname"}
 
 :local displayed true
 :local maxretry 3
 :local retrywaitingtime 120s
 :local retryflag true
 :for retry from=1 to=$maxretry step=1 do={
  :if ($retryflag) do={ :set $retryflag false; :set $counter 0
  :if ($retry > 1) do={
   :put "Source file changed. Retrying after a $retrywaitingtime wait..."
   :if ($nolog = null) do={:log warning "Source file changed. Retrying after a $retrywaitingtime wait..."}
   :delay $retrywaitingtime  }
  
  :local fetchResult [/tool fetch url=$url keep-result=no as-value]
  :local filesize ($fetchResult->"total")
  :local downsize ($fetchResult->"downloaded") 
  :if ($filesize = 0 && $downsize > 0) do={ :set $filesize $downsize}

  :local start 0
  :local maxsize 64000;	        # requested chunk size
  :local end ($maxsize - 1);	# because start is zero the maxsize has to be reduced by one
  :local partnumber	 ($filesize / ($maxsize / 1024)); # how many full chunks of maxsize
  :local remainder	 ($filesize % ($maxsize / 1024)); # the last partial chunk
  :if ($remainder > 0)    do={ :set $partnumber ($partnumber + 1) }; # total number of chunks
  :if ($heirule != null) do={:put "Using as extra filtering: $heirule"} else={:set $heirule "."}
 # remove the current list completely if "erase" is not present (default setting)
  :if ($noerase = null) do={  
   :if ($timeout = null) do={:set $timeout 00:00:00; :do {:foreach i in=[/ip firewall address-list find list=$listname] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} } else={
   :do {:foreach i in=[/ip firewall address-list find list=$listname dynamic] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} };                
   :put ("Conditionally deleting all".$dynamic." entries in address-list: $listname")
   :if ($nolog = null) do={:log warning ("Conditionally deleting all".$dynamic." entries in address-list: $listname")}
  } else={:put "Entries not conditional deleted in address-list: $listname"}; # ENDIF ERASE
 :for x from=1 to=$partnumber step=1 do={
   # get filesize to be compared to the original one and if changed then retry
   :local comparesize ([/tool fetch url=$url keep-result=no as-value]->"total")
   :if ($comparesize = 0 && $downsize > 0) do={ :set $comparesize $downsize}
   
   # fetching the chunks from the webserver when the size of the source file has not changed
   # empty array when the source file changed. No processing is done till the next complete retry
   :if ($comparesize = $filesize) do={:set $data ([:tool fetch url=$url http-header-field="Range: bytes=$start-$end" output=user as-value]->"data")} else={:set $data [:toarray ""]; :set $retryflag true}
     #:if ($ownposix = null) do={
  # determining the used delimiter in the list, when not provided in the config
   # this only run once and so the impact on the import time is low
    :local ipv4Posix	  "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}"
    :local ipv4rangePosix "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}/[0-9]{1,2}"
    :local domainPosix	  "^.+\\.[a-z.]{2,7}"
    :local sdata $data;
   # removes any lines at the top of the file that could interfere with finding the correct posix. Setting remarksign is needed
    :while ([:pick $sdata 0 1] = $remarksign) do={ :set $sdata [:pick $sdata ([:find $sdata "\n"]+1) [:len $sdata]] }    
    :while ([:len $sdata]!=0 && $delimiter = null) do={ # The check on length of $sdata is for if no delimiter is found.   
       	:local sline [:pick $sdata 0 [:find $sdata "\n"]]; :local slen [:len $sline];
       	# set posix depending of type of data used in the list
       	:if ($sline ~ $ipv4Posix)	    do={:set $posix $ipv4Posix;	     :set $iden "List identified as a IPv4 list"}
       	:if ($sline ~ $ipv4rangePosix)	do={:set $posix $ipv4rangePosix; :set $iden "List identified as a IPv4 with ranges list"}
       	:if ($sline ~ $domainPosix)	    do={:set $posix $domainPosix;	 :set $iden "List identified as a domain list"}
       	:if ($sline ~ $posix) do={:put $iden}
      	:if ($sline ~ $posix) do={ # only explore the line if there is a match at the start of the line.
	      :do {:if ([:pick $sline 0 ($slen-$send)] ~ ($posix."\$") || $send > $slen) do={
	        :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-1))]; :set $result true} else={:set $send ($send+1)}  
             :if ($result) do={ :set  $extra [:pick $sline ($slen-$send) ($slen-($send-1))]
              :if ( $extra = " " )   do={ :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-2))] }
              :if ( $extra = "  " )  do={ :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-3))] }
              :if ( $extra = "   " ) do={ :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-4))] }
             }; # EndIf result
	      } while (!$result); # EndDoWhile
	    }; #IF sline posix
	:set $sdata [:pick $sdata ([:find $sdata "\n"]+1) [:len $sdata]]; # cut off the already searched lines
	:if ($delimiter != null) do={:local sdata [:toarray ""]} ; #Clearing sdata array ending the WhileDo loop
    }; #WHILE END $sdata
    :local sdata [:toarray ""]
   :if ([:len $delimiter] = 0) do={ :set $delimiter "\n"; :set $delimiterShow "New Line" } else={ :set $delimiterShow $delimiter }; # when empty use NewLine 20220529	
   #} else={:put "User defind Posix: $ownposix"; :set $posix $ownposix } ; # ENDIF ownposix = null
   :if ($delimiter != null && $displayed ) do={:set $displayed false; :put "Using config provided delimiter: \"$delimiterShow\""}
   :if ($posix = null) do={:set $posix "."}; # Use a match all posix if nothing is defined or found 
   :if (!$retryflag) do={:put "Reading Part: $x $start - $end"}
   :if ($timeout = null) do={:local timeout 00:00:00}; # if no timeout is defined make it a static entry.    
   # Only remove the first line if you are not at the start of the list
   
:while ( [:pick $data 0 1] = $remarksign) do={ :set $data [:pick $data ([:find $data "\n"]+1) [:len $data]] }; # removes the invalid line (Spamhaus) 
   
   :if ($start > 0) do={:set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]}
     :while ([:len $data]!=0) do={
       :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
       :if ( $line ~ $posix && $line~$heirule) do={
        :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$comment timeout=$timeout; :set $counter ($counter + 1)} on-error={}; # on error avoids any panics        
       }; # if IP address && extra filter if present
      :set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
      # Cut off the end of the chunks by removing the last lines...very dirty but it works
      :if ([:len $data] < 256) do={:set $data [:toarray ""]}    
     }; # while

  :set $start (($start-512) + $maxsize); # shifts the subsequent start back by 512
  :set $end (($end-512) + $maxsize); # shifts the subsequent end back by 512 to keep the chunks overlapping
  }; # if retryflag
 }; #do for x
 
}; # for retry
 :if ($counter < 1) do={:set $resultline "Import was NOT successful! Check if the list $listname is still being maintained."} else={:set $resultline "Completed reading $counter items into address-list $listname." }
 :put $resultline
 :if ($nolog = null) do={:log warning $resultline }
 :if ($counter > 0) do={:do {/ip firewall address-list remove [find where list=("backup".$listname)]} on-error={} } else={
 :do {:foreach i in=[/ip firewall address-list find list=("backup".$listname)] do={/ip firewall address-list set list=$listname $i }} on-error={}
 :put "Restoring backup list: $listname" 
 :if ($nolog = null) do={:log warning "Restoring backup list: $listname"}
 }; # if counter restore on failure and remove on success
}; # do
$update url=https://www.spamhaus.org/drop/drop.txt listname=spamhaus remarksign=";" timeout=1d nolog=1
$update url=https://lists.blocklist.de/lists/all.txt listname=blockDE timeout=1d nolog=1
}

# Required config-line settings:
# url=	        https://name.of.the.list
# listname=	name of the address-list

# Optional settings
# timeout=	the time the entry should be active. If omitted then static entries are created.
# comment=	puts this comment on every line in the chosen address-list (default: no comment)
# heirule=	this will select on a word on each line whether to import or not (default: no heirule)
# noerase=	any value, then the current list is not erased (default: erase)
# ownPosix=	allows entering an own regex posix to be used (not active at this moment)
# nolog=        any value, then don't write to the log (default: writing to log)
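The ranged-download arithmetic in the script above (fixed-size chunks shifted back by 512 bytes so entries cut at a chunk boundary reappear in the next chunk) can be sketched in Python. The fetch call is replaced by plain range bookkeeping, and the part count is derived directly rather than via the script's KiB conversion, so this only illustrates the overlapping-window idea:

```python
# Sketch of the script's ranged-download arithmetic (assumed values, no real fetch):
# fixed-size chunks with a 512-byte backward overlap so entries split across a
# chunk boundary are seen again at the start of the next chunk.
def chunk_ranges(filesize, maxsize=64000, overlap=512):
    ranges = []
    start, end = 0, maxsize - 1
    while start < filesize:
        ranges.append((start, min(end, filesize - 1)))
        # shift the next window back by the overlap, as the script does
        start = (start - overlap) + maxsize
        end = (end - overlap) + maxsize
    return ranges

print(chunk_ranges(150000))
# [(0, 63999), (63488, 127487), (126976, 149999)]
```

Each tuple would become a "Range: bytes=start-end" header on a separate fetch, matching the "Reading Part: ..." output seen earlier in the thread.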
I removed the previous versions of this script to avoid any confusion.

Updated the textual part of the script so it states when "\n" NewLine is enforced.

Very nice work - thank you!
But I think I found a small issue with this:
if no delimiter is passed (so "\n" is used) and the last line of the IP list is not terminated with "\n", the last line seems to be ignored.
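The reported behaviour is easy to reproduce. A Python sketch of the same find-"\n"-and-cut loop (hypothetical data; RouterOS :find returns nil where Python returns -1), showing the unterminated last line being dropped:

```python
def parse_lines(data):
    """Mimic the script's loop: cut at each '\\n'; text after the final '\\n' is lost."""
    out = []
    while data:
        idx = data.find("\n")
        if idx == -1:
            # the RouterOS loop has no branch for a missing terminator,
            # so the unterminated last line is effectively discarded
            break
        out.append(data[:idx])
        data = data[idx + 1:]
    return out

print(parse_lines("192.0.2.1\n192.0.2.2"))    # ['192.0.2.1'] - last entry lost
print(parse_lines("192.0.2.1\n192.0.2.2\n"))  # ['192.0.2.1', '192.0.2.2']
```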
 
User avatar
vdias
newbie
Posts: 28
Joined: Sat Apr 14, 2012 12:09 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Jul 08, 2023 3:48 am

Using these lists on my setup for some years now...
$update url=https://feeds.dshield.org/block.txt description=DShield delimiter=("\t") cidr=/24
$update url=http://www.spamhaus.org/drop/drop.txt description="Spamhaus DROP" delimiter=("\_")
$update url=http://www.spamhaus.org/drop/edrop.txt description="Spamhaus EDROP" delimiter=("\_")
$update url=https://sslbl.abuse.ch/blacklist/sslipblacklist.txt description="Abuse.ch SSLBL" delimiter=("\r")
$update url=https://raw.githubusercontent.com/ktsaou/blocklist-ipsets/master/firehol_level2.netset description="FireHOL Level2" delimiter=("\n")
$update url=https://lists.blocklist.de/lists/all.txt description="BlockList.de" delimiter=("\n")
Any newer recommended lists?

And just to be sure... what is the latest recommended version of the script?
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Jul 08, 2023 8:17 am

Using this list on my setup for some years now...
Why? Some of those lists include the others.
 
User avatar
rextended
Forum Guru
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Jul 08, 2023 9:58 am

Using this list on my setup for some years now...
Are you an ISP, or is your ISP not acting as an ISP?
Too many lists will kill you...
 
User avatar
vdias
newbie
Posts: 28
Joined: Sat Apr 14, 2012 12:09 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Jul 08, 2023 5:57 pm

Not an ISP, just my home configuration.

Which lists are you using? Which lists are contained in the others?
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Dec 19, 2023 7:03 pm

For my routers, this stopped working with the v7.13 update:
invalid value of "to", must be integer
The very old, original version, which had the maximum 63 KiB size limit, still works though.

While the script runs, I see this many times in the log; I believe it is once per 'chunk'?
Download from https://user:***@FQDN/folder/file to RAM FAILED: Fetch failed with status 206
ending with
Download from https://user:***@FQDN/folder/file to <nothing> FINISHED

However another list just shows once in the log,
Download from https://anonymous:***@FQDN/kevinds/folder/all.txt to <nothing> FINISHED
and it is 380kb

Same error when running from a shell,
/system/script> run blocklist-update
Starting import of address-list: blocklist
invalid value of "to", must be integer
Anybody else having this issue? Or just me?
 
User avatar
mozerd
Forum Veteran
Forum Veteran
Posts: 927
Joined: Thu Oct 05, 2017 3:39 pm
Location: Canada
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Dec 19, 2023 7:53 pm

@kevinds. “Anybody else having this issue? Or just me?”

Yes, version 7.13 is a problem that's already reported in the 7.13 upgrade thread.

I had to downgrade to 7.12.1, where all my scripts worked ....
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Dec 19, 2023 7:56 pm

Thank you.
 
User avatar
rextended
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Wed Dec 20, 2023 7:43 pm

From 7.14beta changelog:
*) fetch - do not require "content-length" for HTTP (introduced in v7.13);
*) fetch - treat any 2xx HTTP return code as success (introduced in v7.13);
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Wed Dec 20, 2023 10:35 pm

I'm kinda surprised/annoyed they don't make those a v7.13.1 update.. There is still time, maybe they will.. They kinda broke fetch for many use-cases.
 
An5teifo
Frequent Visitor
Posts: 89
Joined: Mon Dec 13, 2021 10:51 am
Location: Austria

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Jan 21, 2024 10:13 am

I just upgraded to 7.14beta7 to verify whether fetch works for this, but it does not.
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Jan 21, 2024 11:31 am

Did you try v7.13.2?
 
An5teifo
Frequent Visitor
Posts: 89
Joined: Mon Dec 13, 2021 10:51 am
Location: Austria

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Jan 21, 2024 11:33 am

Yes, I tried that as well.
I just received the same error messages as you @kevinds
 
User avatar
mhenriques
newbie
Posts: 49
Joined: Sat Mar 23, 2019 8:45 pm
Location: BRAZIL
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Jan 21, 2024 2:39 pm

I found this script on the forum. It works OK on my hEX S running 7.13.2.
The only change I've made was to concentrate all entries on a single "blacklist" and select the entries via the comment field.

MH

:global readfile do={
    :local url        $1
    :local thefile    ""
    :local filesize   ([/tool fetch url=$url as-value output=none]->"downloaded")
    :local maxsize    64512 ; # is the maximum supported readable size of a block from a file
    :local start      0
    :local end        ($maxsize - 1)
    :local partnumber ($filesize / ($maxsize / 1024)); # "downloaded" is reported in KiB, so this is size / 63 KiB parts
    :local remainder  ($filesize % ($maxsize / 1024))
    :if ($remainder > 0) do={ :set partnumber ($partnumber + 1) }
    :for x from=1 to=$partnumber step=1 do={
         :set thefile ($thefile . ([/tool fetch url=$url http-header-field="Range: bytes=$start-$end" as-value output=user]->"data"))
         :set start   ($start + $maxsize)
         :set end     ($end   + $maxsize)
    }
    :return $thefile
}

{
/ip firewall address-list
:local update do={
 :global readfile
 :put "Starting import of address-list: $listname"
  :put "Deleting all dynamic entries in address-list: $listname"
  :if ($heirule != null) do={:put "Using as extra filtering: $heirule"}
  :if ($heirule = null) do={:set $heirule "."}
  :local n 0; # counter
  
 # remove the current list completely
 :do { /ip firewall address-list remove [find where comment=$description dynamic]} on-error={};
### line replaced ###  :local data ([:tool fetch url=$url output=user as-value]->"data")
   :local data [$readfile $url]
   :put "Imported file length $[:len $data] bytes"
     :while ([:len $data]!=0) do={ 
       :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
       :if ($line~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}" && $line~$heirule) do={
        :set $n ($n+1) 
        :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$description timeout=$timeout} on-error={};
       }; # if IP address && extra filter if present
      :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
     }; # while
 :put "Completed importing $listname added/replacing $n lines."
}; # do

$update url=("https://" . "lists.blocklist.de/lists/all.txt") delimiter=("\n") listname=blacklist description=BlockDE timeout=1d
$update url=("https://" . "www.dshield.org/block.txt") delimiter=("\t") listname=blacklist description=DShield timeout=1d
$update url=("https://" . "iplists.firehol.org/files/firehol_level2.netset") delimiter=("\n") listname=blacklist description=FireHOLL2 timeout=1d
$update url=("https://" . "view.sentinel.turris.cz/greylist-data/greylist-latest.csv") delimiter="," listname=blacklist description=GreyList timeout=1d heirule=http|smtp
$update url=("https://" . "www.spamhaus.org/drop/drop.txt") delimiter=" ; " listname=blacklist description=SpamHaus timeout=1d
$update url=("https://" . "sslbl.abuse.ch/blacklist/sslipblacklist.txt") delimiter=("\r") listname=blacklist description=SSLBL timeout=1d

}

 
An5teifo
Frequent Visitor
Posts: 89
Joined: Mon Dec 13, 2021 10:51 am
Location: Austria

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Jan 21, 2024 6:16 pm

I found this script on the forum. It works OK on my hEX S running 7.13.2.
The only change I've made was to concentrate all entries on a single "blacklist" and select the entries via the comment field.

MH


This script also runs on the current beta without any problems.
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Jan 22, 2024 9:54 pm

# remove the current list completely
 :do { /ip firewall address-list remove [find where comment=$description dynamic]} on-error={};
### line replaced ###  :local data ([:tool fetch url=$url output=user as-value]->"data")
I don't like that line..

The script this thread is focused on renews the time left on existing entries. Looking at this one, it won't, since it expects the address list to have been removed first.

Not to mention it removes any/all dynamic entries.. That is bad too.

Cool though.
 
An5teifo
Frequent Visitor
Posts: 89
Joined: Mon Dec 13, 2021 10:51 am
Location: Austria

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Jan 22, 2024 9:57 pm

It just removes the entries with a specific comment
[find where comment=$description dynamic]
The other, more modern script does not work for me as I get the integer error.
Last edited by An5teifo on Tue Jan 23, 2024 9:13 am, edited 1 time in total.
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Jan 22, 2024 10:02 pm

The other, more modern script does not work for me as I get the integer error.
I'm skipping v7.13, going see what changes v7.14 and maybe v7.15 bring and then take a look at fixing the script after that, since fetch is mentioned in the beta changelog.

v7.12 is fine for my networks, what I needed in v7.13 doesn't work on my hardware anyways.
 
An5teifo
Frequent Visitor
Posts: 89
Joined: Mon Dec 13, 2021 10:51 am
Location: Austria

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Jan 23, 2024 9:23 am

The other, more modern script does not work for me as I get the integer error.
I'm skipping v7.13, going see what changes v7.14 and maybe v7.15 bring and then take a look at fixing the script after that, since fetch is mentioned in the beta changelog.

v7.12 is fine for my networks, what I needed in v7.13 doesn't work on my hardware anyways.
Which script have you been using?
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Jan 23, 2024 1:55 pm

Which script have you been using?
I'm using a modified version of this one viewtopic.php?p=935938&sid=9a9086e98c87 ... ed#p935938
 
An5teifo
Frequent Visitor
Posts: 89
Joined: Mon Dec 13, 2021 10:51 am
Location: Austria

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Jan 23, 2024 8:26 pm

I have tried the script you've mentioned with the 7.14beta8 version released today - it still does not work.

Is there a possibility to debug a script to see where the error occurs?
So far I just removed the "nolog" parameter but the only thing I get is
Starting import of address-list: spamhaus
Conditional deleting all entries in address-list: spamhaus
invalid value of "to", must be integer
So I think that $filesize does not get the correct value from the previous fetch command.
 
User avatar
diamuxin
Member
Posts: 340
Joined: Thu Sep 09, 2021 5:46 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Jan 23, 2024 8:50 pm

I have tried the script you've mentioned with the 7.14beta8 version released today - it still does not work.

Is there a possibility to debug a script to see where the error occurs?
So far I just removed the "nolog" parameter but the only thing I get is
Starting import of address-list: spamhaus
Conditional deleting all entries in address-list: spamhaus
invalid value of "to", must be integer
So I think that $filesize does not get the correct value from the previous fetch command.
RB4011 + v7.13.2, works correctly with this version of the script:
viewtopic.php?p=1051118#p1050411

output code

[admin@MikroTik] > system script run test1
Starting import of address-list: blacklist
Deleting all Dynamic enties in address-list: blacklist
Imported file length 341441 bytes
Completed importing blacklist added/replacing 23829 lines.
Starting import of address-list: blacklist
Deleting all Dynamic enties in address-list: blacklist
Imported file length 2191 bytes
Completed importing blacklist added/replacing 20 lines.
Starting import of address-list: blacklist
Deleting all Dynamic enties in address-list: blacklist
Imported file length 256461 bytes
Completed importing blacklist added/replacing 17809 lines.
Starting import of address-list: blacklist
Deleting all Dynamic enties in address-list: blacklist
Using as extra filtering: http|smtp
Imported file length 279228 bytes
Completed importing blacklist added/replacing 3330 lines.
Starting import of address-list: blacklist
Deleting all Dynamic enties in address-list: blacklist
Imported file length 27410 bytes
Completed importing blacklist added/replacing 985 lines.
Starting import of address-list: blacklist
Deleting all Dynamic enties in address-list: blacklist
Imported file length 1049 bytes
Completed importing blacklist added/replacing 38 lines.
 
An5teifo
Frequent Visitor
Posts: 89
Joined: Mon Dec 13, 2021 10:51 am
Location: Austria

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Jan 23, 2024 9:18 pm

Which script have you been using?
I'm using a modified version of this one viewtopic.php?p=935938&sid=9a9086e98c87 ... ed#p935938
I think I found the issue: Your script requires a "total" value after the fetch command.
Currently it is not included in the "as-value" output:
downloaded=26;duration=00:00:00;status=finished
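
Given that, a defensive check before the chunked download loop might look like this (a sketch, not part of the original script; it checks the key the script actually reads, "downloaded"):

```routeros
# Sketch: abort early with a clear message if fetch did not report a
# download size, instead of later failing with
# 'invalid value of "to", must be integer' inside the :for loop.
:local res      [/tool fetch url=$url as-value output=none]
:local filesize ($res->"downloaded")
:if ([:typeof $filesize] != "num") do={
    :error "fetch reported no download size; cannot compute chunk count"
}
```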
 
koalabambu
just joined
Posts: 7
Joined: Tue Dec 05, 2023 11:32 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Jan 29, 2024 3:36 am

Hi guys,
maybe someone can combine the script snippets and post the full one?
would be nice - it's hard to follow, jumping between links from year to year ;)

thx
 
koalabambu
just joined
Posts: 7
Joined: Tue Dec 05, 2023 11:32 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Jan 29, 2024 4:00 am



I'm using a modified version of this one viewtopic.php?p=935938&sid=9a9086e98c87 ... ed#p935938
I think I found the issue: Your script requires a "total" value after the fetch command.
Currently it is not included in the "as-value" output:
downloaded=26;duration=00:00:00;status=finished

Hiiiiii,
could you share the whole script pls ?
thx
 
An5teifo
Frequent Visitor
Posts: 89
Joined: Mon Dec 13, 2021 10:51 am
Location: Austria

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Jan 29, 2024 1:05 pm



I think I found the issue: Your script requires a "total" value after the fetch command.
Currently it is not included in the "as-value" output:
downloaded=26;duration=00:00:00;status=finished

Hiiiiii,
could you share the whole script pls ?
thx

You can find it here -> viewtopic.php?p=996869#p996869
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Jan 29, 2024 1:27 pm

could you share the whole script pls ?
You quoted a link to it...?
 
RSE
just joined
Posts: 5
Joined: Sun Feb 18, 2024 10:26 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Mar 02, 2024 3:50 pm

I found this script on the forum. It works OK on my hEX S running 7.13.2.
The only change I've made was to concentrate all entries on a single "blacklist" and select the entries via the comment field.

MH


This is awesome. I have had it running for a few days now on my RB5009 @ v7.13.5 and it works flawlessly. It auto-updates the list every day.
 
User avatar
mhenriques
newbie
Posts: 49
Joined: Sat Mar 23, 2019 8:45 pm
Location: BRAZIL
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Wed Mar 06, 2024 1:51 pm

A problem I have identified with this script stems from the DShield list's odd format. Instead of containing an IP address plus prefix length, each line contains the start address of the block, the end address of the block, and the subnet mask bits. To ensure the blacklist is populated with the whole subnet block instead of only the first address, we need to add code to treat this special case.
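
A minimal sketch of that special case, assuming the columns are "startIP <TAB> endIP <TAB> maskbits" (the sample line and fixed column order are assumptions, and DShield lines carry extra columns after the mask):

```routeros
# Hypothetical one-off: turn "startIP \t endIP \t maskbits" into "startIP/maskbits"
:local line  "192.0.2.0\t192.0.2.255\t24"
:local t1    [:find $line "\t"]
:local start [:pick $line 0 $t1]
:local rest  [:pick $line ($t1 + 1) [:len $line]]
:local t2    [:find $rest "\t"]
:local mask  [:pick $rest ($t2 + 1) [:len $rest]]
# $start is now "192.0.2.0" and $mask "24"
/ip firewall address-list add list=blacklist address=($start . "/" . $mask) comment=DShield timeout=1d
```

In the $update function this would replace the plain `[:pick $data 0 [:find $data $delimiter]]` for the DShield call only.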

Mauricio
 
texmeshtexas
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 07, 2024 4:20 am

Hey all, i have a question.
For the last couple of years I've been building a system that I've been using for 2 businesses and my own home office.
I take all these OSINT lists from many sources like firehol1,2,3, Spamhaus, Emerging Threats and more.
Also some paid lists from Bambeneck Consulting, AbuseIPdb, MalwarePatrol.
Right now there are 115-120K IP list entries representing about 578M IPs. Most are updated hourly but some less frequently, per the list provider.
And I'm open to more from the community if it makes sense.

I bring them all together onto a server, clear out things like RFC1918 and default addresses (both of which I've found on lists in the past),
remove duplicates among all the lists and create Mikrotik formatted .rsc files that can be downloaded and imported.

I wouldn't mind sharing this with the community if there is enough interest. But the server, lists and the work to maintain things cost some money, so I can't really do it for free. If there is enough interest I'll work to set up individual accounts for just sharing the lists.
I may offer 2-3 different mixes of lists that may be more suitable for your specific needs.
I'm thinking about US$100/yr or US$10/mo.

How many would be interested?
 
msatter
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 07, 2024 9:50 am

You are not the first one: viewtopic.php?t=98804 and in the end the setup did not gain enough support to be continued.
 
User avatar
rextended
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 07, 2024 3:11 pm

I'm thinking about US$100/yr or US$10/mo.
Flashstart costs much less and is better implemented all over the world.
 
User avatar
mozerd
Forum Veteran
Posts: 927
Joined: Thu Oct 05, 2017 3:39 pm
Location: Canada
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 07, 2024 3:27 pm

Hey all, i have a question.
For the last couple of years I've been building a system that I've been using for 2 business and my own home office.
.........
How many would be interested?
I more than welcome the competition ... :D

MOAB ... MOAB blocks over 600 million Bad Guys from attacking your Internet
PREREQUISITES for MikroTik Router's
10 day free Trial available for 1st time users
 
User avatar
rextended
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 07, 2024 4:45 pm

Also some paid lists from Bambeneck Consulting, AbuseIPdb, MalwarePatrol

I may offer 2-3 different mixes of lists that may be more suitable for your specific needs.
I'm thinking about US$100/yr or US$10/mo.
This is a scam, I don't think that whoever provides you with the paid lists agrees with the fact that you resell them...

At that point, if you have to be a criminal, subscribe to MOAB for $90 and resell the service for $50...
Earnings begin from the second user onwards...

The Internet is full of idiots who think that those lists do something at home or office.
At most they should be implemented by the ISP, because if one or more of those IPs attacks you, all the traffic will reach your home or office anyway, clogging up your connection...
 
texmeshtexas
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 07, 2024 4:59 pm

I'm thinking about US$100/yr or US$10/mo.
Flashstart costs much less and is better implemented all over the world.
Flashstart is a DNS filtering system. Not IP based.
I actually run both using either Quad9(free no reporting) or NextDNS($20/yr but has controls and reporting).

DNS filtering is a bit tricky because simply setting a DNS address on your DHCP server, for example, to 9.9.9.9 is merely a suggestion to the client. It's not mandatory that the client use that DNS server address. In fact I've observed many IoT devices using 8.8.8.8 and all Apple devices using Apple DNS servers. So there are major gaps in protection.
My approach is to block DoH/DoT causing devices to resort to using standard DNS (UDP port 53) and then perform a dst-nat on that traffic to my DNS service of choice. This captures ALL devices on the network instead of just some of them.
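
A rough sketch of that approach (the interface name "bridge" and the resolver 9.9.9.9 are placeholders, not part of the original post):

```routeros
# Sketch: redirect all plain DNS from the LAN to a chosen resolver and
# block DoT so clients fall back to port 53.
/ip firewall nat add chain=dstnat in-interface=bridge protocol=udp dst-port=53 action=dst-nat to-addresses=9.9.9.9 comment="capture plain DNS (UDP)"
/ip firewall nat add chain=dstnat in-interface=bridge protocol=tcp dst-port=53 action=dst-nat to-addresses=9.9.9.9 comment="capture plain DNS (TCP)"
/ip firewall filter add chain=forward protocol=tcp dst-port=853 action=reject comment="block DoT"
# DoH (tcp/443) cannot be matched by port alone; it needs an address
# list of known DoH resolver IPs, like the other lists in this thread.
```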
 
texmeshtexas
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 07, 2024 5:03 pm

Also some paid lists from Bambeneck Consulting, AbuseIPdb, MalwarePatrol

I may offer 2-3 different mixes of lists that may be more suitable for your specific needs.
I'm thinking about US$100/yr or US$10/mo.
This is a scam, I don't think that whoever provides you with the paid lists agrees with the fact that you resell them...

At that point, if you have to be a criminal, subscribe to MOAB for $90 and resell the service for $50...
Earnings begin from the second user onwards...

The Internet is full of idiots who think that those lists do something at home or office.
At most they should be implemented by the ISP, because if one or more of those IPs attacks you, all the traffic will reach your home or office anyway, clogging up your connection...
The lists I pay for can be used for commercial purposes.
And I agree, a DoS or DDoS attack cannot be fully handled at the edge router only.
But most infiltration attacks start with a simple probe to see if they can get in. Blocking that sends the attacker looking elsewhere. Unless of course you are a giant target with lots of data or money to get. But then again, that is not what we are protecting. Just small businesses.
 
texmeshtexas
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 07, 2024 5:09 pm

Hey all, i have a question.
For the last couple of years I've been building a system that I've been using for 2 business and my own home office.
.........
How many would be interested?
I more than welcome the competition ... :D

MOAB ... MOAB blocks over 600 million Bad Guys from attacking your Internet
PREREQUISITES for MikroTik Router's
10 day free Trial available for 1st time users
Hi Mozerd,
I could really never tell what exactly makes up the MOAB list. I was under the impression it was just the firehol lists. Not sure where I got that impression. Tell me if I'm wrong.
 
User avatar
rextended
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 07, 2024 5:22 pm

Flashstart is just one provocation... ;)

***********************

But most infiltration attacks start with a simple probe to see if they can get in.
This means that there is some basic error in the network configuration.
Everything must be blocked unless explicitly admitted,
so the IP list is absolutely useless.

If we are talking about a server/service that must necessarily be reachable from the Internet, that's another matter,
usually certain services must be placed in a datacenter, rather than at an amateur level inside the home or office...
 
texmeshtexas
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 07, 2024 5:25 pm

You are not the first one: viewtopic.php?t=98804 and then the setup did not gain enough support to be continued.
Very different. I am not trying to create the lists myself. I rely on organizations that run honeypots, actively probe, or use AI to determine indicators of compromise, and only with high confidence. For example, how would you know that a contact attempt from an unknown IP is a phishing site, a bot C&C server, or a malware distribution server? It takes more sophisticated systems to determine these things. There are hundreds of organizations and companies around the planet that focus on threat intelligence. I just try to make use of that info.
 
texmeshtexas
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 07, 2024 5:36 pm

Flashstart is just one provocation... ;)

***********************

But most infiltration attacks start with a simple probe to see if they can get in.
This means that there is some basic error in the network configuration.
Everything must be blocked unless explicitly admitted,
so the IP list is absolutely useless.

If we are talking about a server/service that must necessarily be reachable from the Internet, that's another matter,
usually certain services must be placed in a datacenter, rather than at an amateur level inside the home or office...
Yes, some server/services are reachable from the internet such as local security camera/access control systems. Is that the most secure way to do it? No but it is what we deal with.

Here is also what I see, a lot: blocks outbound to bad IPs on the block lists. That means someone either clicked on a link in an email, or brought a device to the office that is infected with malware and it's trying to make contact with an outside server for instructions (C&C) or to exfiltrate data. Blocking this traffic is desirable. And DNS filtering may or may not stop it. If a malware writer wants to avoid DNS filters, an easy way is to not use DNS and just send exfiltrated data to a relay IP. I'm not a cyber criminal so I don't know how common that is, but it's certainly possible. So blocking IPs is important also.
 
User avatar
rextended
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 07, 2024 5:53 pm

The biggest vulnerability you can have on the internet are the little men (and women) sitting in front of a screen, with a keyboard and a mouse...
No matter how much you filter, if you have incompetent people at work, your network will always leak everywhere...
 
User avatar
mozerd
Forum Veteran
Posts: 927
Joined: Thu Oct 05, 2017 3:39 pm
Location: Canada
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 07, 2024 7:33 pm

Hi Mozerd,
I could really never tell what exactly makes up the MOAB list. I was under the impression it was just the firehol lists. Not sure where I got that impression. Tell me if I'm wrong.
@texmeshtexas greetings 😀
You are not wrong …. But do you fully understand what makes up firehol …. [Overlaps of firehol_levels with other IP lists that number in the thousands …]

In any case I wish you good luck ... may your endeavour be successful ...
 
User avatar
mozerd
Forum Veteran
Posts: 927
Joined: Thu Oct 05, 2017 3:39 pm
Location: Canada
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 07, 2024 7:41 pm

…….
At most they should be implemented by the ISP, because if one or more of those IPs attacks you, all the traffic will reach your home or office anyway, clogging up your connection...
@rextended … salute 😀
I agree BUT very few do that ….
My subscribing clients are very pleased with my MOAB service ...
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 07, 2024 8:35 pm

For example, how would you know that a contact attempt from an unknown IP is a phishing site, a bot C&C server, or a malware distribution server? It takes more sophisticated systems to determine these things. There are hundreds of organizations and companies around the planet that focus on threat intelligence. I just try to make use of that info.
The reason such services don't work for me is that, depending on the traffic/list, I don't just want it blocked, I want to be notified about it.

Shit traffic can just be blocked sure, but a C&C server or malware distribution, I want to know if a host is trying to contact them, those lists I have set with 'log'.

The other issue you have with IP lists is that for example firehol is made up of many other IP lists, you need to find out from each of them, not just if they can be used in commercial services (protection on a business router), but if you are allowed to resell them yourself.
 
texmeshtexas
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Mar 08, 2024 2:39 am

Hi Mozerd,
I could really never tell what exactly makes up the MOAB list. I was under the impression it was just the firehol lists. Not sure where I got that impression. Tell me if I'm wrong.
@texmeshtexas greetings 😀
You are not wrong …. But do you fully understand what makes up firehol …. [Overlaps of firehol_levels with other IP lists that number in the thousands …]

In any case I wish you Good Luck … may your endeavour be successful…
Yeah, as of today firehol 1, 2, 3 have 35.9K total entries, but after I remove duplicates with other lists it has 19.2K.
An entry being a /32 or larger.
And after adding other lists I end up with a total of 115K entries. Not including DoH/DoT list of 3.9K entries.
My biggest issue with firehol is that some of the lists are no longer valid, because the provider either stopped offering them, or the URL has changed (like Snort), or the format has changed (like DShield), and the firehol maintainers are not staying on top of it, despite people messaging them. But there is still a lot of value there.
I'm also pulling in some IPv6 lists and have about 1.9K entries there.
 
texmeshtexas
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Mar 08, 2024 3:02 am

For example, how would you know that a contact attempt from an unknown IP is a phishing site, a bot C&C server, or a malware distribution server? It takes more sophisticated systems to determine these things. There are hundreds of organizations and companies around the planet that focus on threat intelligence. I just try to make use of that info.
The reason such services don't work for me is that, depending on the traffic/list, I don't just want it blocked, I want to be notified about it.

Shit traffic can just be blocked sure, but a C&C server or malware distribution, I want to know if a host is trying to contact them, those lists I have set with 'log'.

The other issue you have with IP lists is that for example firehol is made up of many other IP lists, you need to find out from each of them, not just if they can be used in commercial services (protection on a business router), but if you are allowed to resell them yourself.
Yeah, I understand the value in that and I do that myself. Especially if the MT has LAN side visibility and can capture the LAN host talking out.
If you pulled the list separately instead of all one big list, you could do that. That's how I do it.
The real challenge is that many phishing sites are on shared IP servers with many other valid, clean servers. I've seen a single IP shared with up to 18,000 hosts!! That's crazy. It can result in some false detects from the end user's perspective. So using domain filtering through DNS with logging (like NextDNS) is more precise.

Firehol is all public available open source. I'm not trying to make money on the lists. Just cover my costs to bring it all together, clean and format the lists for MT.
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Mar 08, 2024 6:48 am

Firehol is all public available open source. I'm not trying to make money on the lists. Just cover my costs to bring it all together, clean and format the lists for MT.
Alright, but the FireHOL Level 1 page starts out with,

"includes: bambenek_c2 dshield feodo fullbogons spamhaus_drop spamhaus_edrop sslbl ransomware_rw"

bambenek_c2 - Last updated 792 days ago; contains 1 IP.

dshield - Last updated 458 days ago; contains 5,120 IPs.

feodo - Last updated 1913 days ago.

FullBogons - I need to filter these carefully, as I do need to route these; there is NO WAY I will run an .rsc script to import them.

sslbl - Last updated 1909 days ago.

ransomware_rw - Last updated 1551 days ago.

"Firehol is all public available open source" - the software itself is. The lists they use to generate their IP lists, not all of them are.

Off-topic though, so I will leave this topic alone now.
 
texmeshtexas
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Mar 09, 2024 2:19 am

Firehol is all public available open source. I'm not trying to make money on the lists. Just cover my costs to bring it all together, clean and format the lists for MT.
Alright, but the FireHOL Level 1 page starts out with,

"includes: bambenek_c2 dshield feodo fullbogons spamhaus_drop spamhaus_edrop sslbl ransomware_rw"

bambenek_c2 - Last updated 792 days ago; contains 1 IP.

dshield - Last updated 458 days ago; contains 5,120 IPs.

feodo - Last updated 1913 days ago.

FullBogons - I need to filter these carefully, as I do need to route these; there is NO WAY I will run an .rsc script to import them.

sslbl - Last updated 1909 days ago.

ransomware_rw - Last updated 1551 days ago.

"Firehol is all public available open source" - the software itself is. The lists they use to generate their IP lists, not all of them are.

Off-topic though, so I will leave this topic alone now.

Yeah, my point exactly. Bambenek went paid. I pay for it and it's not that expensive, a couple hundred a year.
We strip RFC1918 addresses from all lists, as well as 0.0.0.0/0 and ::/0, as I've seen those show up before.

I just whitelist Loopback and Multicast ranges if needed.
Every implementation should have a WhiteList capability.

I'm always in search of a good OSINT list, or an affordable one, and I add it if (1) it's maintained regularly and (2) it adds at least a couple hundred new unique entries to the overall set.
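As an aside for anyone building a similar aggregate: counting "new unique entries" with a plain set difference is easy to get wrong, because a /32 already covered by an existing /24 still looks new. A hypothetical Python sketch (sample prefixes invented, not any real blocklist data) that counts genuinely new prefixes using subnet containment:

```python
import ipaddress

def truly_new(candidate, aggregate):
    """Return candidate prefixes not already covered by any aggregate prefix.

    A plain set difference misses the case where a smaller prefix sits
    inside a larger one that is already listed, so check subnet
    containment explicitly.
    """
    agg = [ipaddress.ip_network(n) for n in aggregate]
    fresh = []
    for c in candidate:
        net = ipaddress.ip_network(c)
        if not any(net.version == a.version and net.subnet_of(a) for a in agg):
            fresh.append(c)
    return fresh

# 10.0.0.5/32 is inside the already-listed 10.0.0.0/24, so only the
# second prefix counts as a new unique entry.
print(truly_new(["10.0.0.5/32", "203.0.113.0/24"], ["10.0.0.0/24"]))  # → ['203.0.113.0/24']
```

With real .netset files you would read one prefix per line (skipping `#` comments) before feeding the two lists in.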
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Mar 09, 2024 6:00 am

I'm always in search of a good OSINT list, or an affordable one, and I add it if (1) it's maintained regularly and (2) it adds at least a couple hundred new unique entries to the overall set.
My downloaded blacklists have +/- 120,000 entries, then my automated defenses add around 100,000 after two weeks..

I'm working on a new generator this weekend that I'm hoping will cover more of the illegal activities traffic, should be around 1,000 more network prefixes.
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Mar 09, 2024 8:47 am

Ok... I need help, my address lists are not being imported correctly, they are missing entries..

1085 out of 1100 on my latest creation, and 75 out of 93 on another..

RouterOS v7.12.2

{
/ip firewall address-list
:local update do={
 :put "Starting import of address-list: $listname"
 :if ($nolog = null) do={:log warning "Starting import of address-list: $listname"}

 :local maxretry 3
 :local retrywaitingtime 120s
 :local retryflag true
 :for retry from=1 to=$maxretry step=1 do={
  :if ($retryflag) do={ :set $retryflag false; :set $counter 0
  :if ($retry > 1) do={
   :put "Source file changed. Retrying after a $retrywaitingtime wait..."
   :if ($nolog = null) do={:log warning "Source file changed. Retrying after a $retrywaitingtime wait..."}
   :delay $retrywaitingtime  }
  
 :local filesize ([/tool fetch url=$url src-address=$IP user=$user password=$password keep-result=no as-value]->"total")
 :local start 0
 :local maxsize 64000;	        # requested chunk size
 :local end ($maxsize - 1);	# because start is zero the maxsize has to be reduced by one
 :local partnumber	 ($filesize / ($maxsize / 1024)); # how many chunks of maxsize there are
 :local remainder	 ($filesize % ($maxsize / 1024)); # the last partial chunk
 :if ($remainder > 0)    do={ :set $partnumber ($partnumber + 1) }; # total number of chunks
 :if ($heirule != null) do={:put "Using as extra filtering: $heirule"} else={:set $heirule "."}
 # remove the current list completely if "noerase" is not set (default setting)
  :if ($noerase = null) do={  
   :if ($timeout = null) do={:set $timeout 00:00:00; :do {:foreach i in=[/ip firewall address-list find list=$listname] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} } else={
   :do {:foreach i in=[/ip firewall address-list find list=$listname dynamic] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} };                
   :put ("Conditional deleting all".$dynamic." entries in address-list: $listname")
   :if ($nolog = null) do={:log warning ("Conditional deleting all".$dynamic." entries in address-list: $listname")}
  } else={:put "Entries not conditional deleted in address-list: $listname"}; # ENDIF ERASE
 :for x from=1 to=$partnumber step=1 do={
   # get filesize to be compared to the original one and if changed then retry
   :local comparesize ([/tool fetch url=$url src-address=$IP user=$user password=$password keep-result=no as-value]->"total")
   
#:set $comparesize 5 

   # fetching the chunks from the webserver when the size of the source file has not changed
   # empty array when the source file changed. No processing is done till the next complete retry
   :if ($comparesize = $filesize) do={:set $data ([:tool fetch url=$url src-address=$IP user=$user password=$password http-header-field="Range: bytes=$start-$end" output=user as-value]->"data")} else={:set $data [:toarray ""]; :set $retryflag true}
     #:if ($ownposix = null) do={
  # determining the used delimiter in the list if not provided in the config
   # this only run once and so the impact on the import time is low
    :local ipv4Posix	  "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}"
    :local ipv4rangePosix "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}/[0-9]{1,2}"
    :local domainPosix	  "^.+\\.[a-z.]{2,7}"
    :local sdata $data;     
    :while ([:len $sdata]!=0 && $delimiter = null) do={ # The check on length of $sdata is for if no delimiter is found.
       	:local sline [:pick $sdata 0 [:find $sdata "\n"]]; :local slen [:len $sline];
       	# set posix depending of type of data used in the list
       	:if ($sline ~ $ipv4Posix)	do={:set $posix $ipv4Posix;	 :set $iden "List identified as a IPv4 list"}
       	:if ($sline ~ $ipv4rangePosix)	do={:set $posix $ipv4rangePosix; :set $iden "List identified as a IPv4 with ranges list"}
       	:if ($sline ~ $domainPosix)	do={:set $posix $domainPosix;	 :set $iden "List identified as a domain list"}
       	:if ($sline ~ $posix) do={:put $iden}
      	:if ($sline ~ $posix) do={ # only explore the line if there is match at the start of the line.
	 :do {:if ([:pick $sline 0 ($slen-$send)] ~ ($posix."\$")|| $send > $slen) do={:set $delimiter [:pick $sline ($slen-$send) ($slen-($send-1))]; :set $result true} else={:set $send ($send+1);} } while (!$result);
	}; #IF posix
	:set $sdata [:pick $sdata ([:find $sdata "\n"]+1) [:len $sdata]];
	:if ($delimiter != null) do={:local sdata [:toarray ""]}; #Clear array sdata and it is not needed anymore and triggering so the While to end
    }; #WHILE END $sdata
    :local sdata [:toarray ""] 
   #} else={:put "User defined POSIX: $ownposix"; :set $posix $ownposix } ; # ENDIF ownposix = null
   :if ($posix = null && $delimiter != null) do={:set $posix "."; :put "Using config-line defined delimiter: \"$delimiter\""}; # delimiter provided by config line
   :if (!$retryflag) do={:put "Reading Part: $x $start - $end"}
   :if ($timeout = null) do={:local timeout 00:00:00}; # if no timeout is defined make it a static entry.    
   # Only remove the first line if you are not at the start of the list
   
   :if ($start > 0) do={:set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]}
     :while ([:len $data]!=0) do={
       :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
       :if ( $line ~ $posix && $line~$heirule) do={
        :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$comment timeout=$timeout; :set $counter ($counter + 1)} on-error={}; # on error avoids any panics        
       }; # if IP address && extra filter if present
      :set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
      # Cut off the end of the chunks by removing the last lines... very dirty but it works
      :if ([:len $data] < 256) do={:set $data [:toarray ""]}    
     }; # while

  :set $start (($start-512) + $maxsize); # shifts the subsequent start back by 512
  :set $end (($end-512) + $maxsize); # shifts the subsequent end back by 512 to keep the 512-byte overlap
  }; # if retryflag
 }; #do for x
 
}; # for retry
 :if ($counter < 1) do={:set $resultline "Import was NOT successful! Check if the list $listname is still being maintained."} else={:set $resultline "Completed reading $counter items into address-list $listname." }
 :put $resultline
 :if ($nolog = null) do={:log warning $resultline }
 :if ($counter > 0) do={:do {/ip firewall address-list remove [find where list=("backup".$listname)]} on-error={} } else={
 :do {:foreach i in=[/ip firewall address-list find list=("backup".$listname)] do={/ip firewall address-list set list=$listname $i }} on-error={}
 :put "Restoring backup list: $listname" 
 :if ($nolog = null) do={:log warning "Restoring backup list: $listname"}
 }; # if counter restore on failure and remove on success
}; # do
:global currentIP
$update url=https://www.example.com/UptimeRobot/IPv4.txt IP=$currentIP listname=UptimeRobot user="anonymous" password="anonymous" timeout=5d noerase=1 comment="UptimeRobot" delimiter=("\n")
}

# To be used configline settings:
# url=	        https://name.of.the.list
# listname=	name of address-list

# Optional settings
# timeout=	the time the entry should be active. If omitted then static entries are created.
# comment=	puts this comment on every line in the chosen address-list (default: no comment)
# heirule=	this will select on a word on each line whether to import or not (default: no heirule)
# noerase=	any value, then the current list is not erased (default: erase)
# ownPosix=	allows entering your own regex POSIX to be used (not active at this moment)
# nolog=        any value, then don't write to the log (default: writing to log)
# delimiter=    manually set the delimiter

So the list here,

https://uptimerobot.com/inc/files/ips/IPv4.txt (I'll get the v6 version when I get my v6 routing working correctly)

I run it through dos2unix first to remove the carriage return at the end of each line, then it goes on a normal webserver so my various RouterOS systems can grab it. I tried a few delimiter variations but in the end it was simpler to just 'fix' the file.
Completed reading 75 items into address-list UptimeRobot.
There are 93 entries in that list.
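For reference, the dos2unix step amounts to the normalisation below (a minimal sketch, not the actual pipeline): without it, a line such as `1.2.3.4\r` carries a trailing carriage return into the `add address=` call and the add fails.

```python
def normalize(text: str) -> str:
    """Convert CRLF (and stray CR) line endings to LF so that splitting on
    "\n" yields clean addresses with no trailing carriage return."""
    return text.replace("\r\n", "\n").replace("\r", "\n")

raw = "1.2.3.4\r\n5.6.7.8\r\n"
print(normalize(raw).split("\n"))  # → ['1.2.3.4', '5.6.7.8', '']
```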

My 1085 out of 1100 comes from Spamhaus ASN-Drop list.. I downloaded all of the IP ranges for all the ASNs listed in the asndrop.json list, ran them through aggregate and ended up with 1100 (before 2326). Only 1085 import though.

The other lists I use, I haven't really paid attention to whether they are missing entries or not. I noticed the UptimeRobot list was missing entries because some were missing when we checked the firewall. I looked at the file in a hex viewer and every line is exactly the same, so I don't know why it isn't importing properly. It isn't the 64 KiB issue as both of these are well below that.

What am I missing? Is the importer script I'm using flawed? I know there have been many versions and variations of this one..

I'm still on v7.12.2 because v7.13 broke this script, and the features I'm interested in from the later versions don't work on my models, so there's no good reason to update.
 
User avatar
patrikg
Member
Posts: 362
Joined: Thu Feb 07, 2013 6:38 pm
Location: Stockholm, Sweden

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Mar 09, 2024 10:13 am

Yeah, my point exactly. Bambenek went paid. I pay for it and it's not that expensive, a couple hundred a year.
We strip RFC1918 addresses from all lists, as well as 0.0.0.0/0 and ::/0, as I've seen those show up before.

I just whitelist Loopback and Multicast ranges if needed.
Every implementation should have a WhiteList capability.

I'm always in search of a good OSINT list, or an affordable one, and I add it if (1) it's maintained regularly and (2) it adds at least a couple hundred new unique entries to the overall set.
Don't forget to strip the CGNAT (RFC 6598) range as well,
between 100.64.0.0 and 100.127.255.255.
CIDR notation for this group is 100.64.0.0/10.

And while I am at it, you may also strip the multicast addresses (RFC 1112),
between 224.0.0.0 and 239.255.255.255.
CIDR notation for this group is 224.0.0.0/4.

And the link-local range (RFC 3927),
between 169.254.0.0 and 169.254.255.255.
CIDR notation for this group is 169.254.0.0/16.
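For anyone pre-processing lists off-router, the ranges above can be stripped before publishing. A hypothetical Python sketch (the sample entries are invented) using the stdlib `ipaddress` module:

```python
import ipaddress

# Reserved/bogon ranges discussed above; entries overlapping any of these
# should normally be stripped from a downloaded blocklist before import.
RESERVED = [ipaddress.ip_network(n) for n in (
    "10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16",  # RFC 1918
    "100.64.0.0/10",                                  # RFC 6598 CGNAT
    "224.0.0.0/4",                                    # RFC 1112 multicast
    "169.254.0.0/16",                                 # RFC 3927 link-local
    "127.0.0.0/8",                                    # loopback
)]

def strip_reserved(entries):
    """Drop any IPv4 entry (address or CIDR) overlapping a reserved range."""
    kept = []
    for e in entries:
        net = ipaddress.ip_network(e, strict=False)
        if not any(net.overlaps(r) for r in RESERVED):
            kept.append(e)
    return kept

print(strip_reserved(["100.64.1.0/24", "203.0.113.7", "169.254.10.1"]))
# → ['203.0.113.7']
```

If, like kevinds, you want to keep some reserved ranges in the firewall, just trim the `RESERVED` tuple accordingly.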
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Mar 09, 2024 10:33 am

Yeah, my point exactly. Bambenek went paid. I pay for it and it's not that expensive, a couple hundred a year.
We strip RFC1918 addresses from all lists, as well as 0.0.0.0/0 and ::/0, as I've seen those show up before.
I don't think I've ever seen a list with 0.0.0.0/0.. That would stand out..

I don't need/want the RFC1918 ranges stripped out of my firewall though; I have all the reserved IPs already listed and I do see traffic from the internet with those source IPs..

Some of the 0.0.0.0/8 IPs I do see in one blacklist I use.. Who decides to include those? Your firewall should automatically be dropping them to begin with.
 
texmeshtexas
Member Candidate
Posts: 151
Joined: Sat Oct 11, 2008 11:17 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Mar 11, 2024 1:12 am

Yeah, my point exactly. Bambenek went paid. I pay for it and it's not that expensive, a couple hundred a year.
We strip RFC1918 addresses from all lists, as well as 0.0.0.0/0 and ::/0, as I've seen those show up before.

I just whitelist Loopback and Multicast ranges if needed.
Every implementation should have a WhiteList capability.

I'm always in search of a good OSINT list, or an affordable one, and I add it if (1) it's maintained regularly and (2) it adds at least a couple hundred new unique entries to the overall set.
Don't forget to strip the CGNAT (RFC 6598) range as well,
between 100.64.0.0 and 100.127.255.255.
CIDR notation for this group is 100.64.0.0/10.

And while I am at it, you may also strip the multicast addresses (RFC 1112),
between 224.0.0.0 and 239.255.255.255.
CIDR notation for this group is 224.0.0.0/4.

And the link-local range (RFC 3927),
between 169.254.0.0 and 169.254.255.255.
CIDR notation for this group is 169.254.0.0/16.
Yeah, I generally whitelist multicast and link-local.
But CGNAT is a good idea. I have not run into customers using CGNAT yet, but will eventually.
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Mar 18, 2024 6:24 pm

Ok... I need help, my address lists are not being imported correctly, they are missing entries..
I think this part is the cause of my issues,
      # Cut off the end of the chunks by removing the last lines... very dirty but it works
      :if ([:len $data] < 256) do={:set $data [:toarray ""]}    
There are 1294 entries in the list.
Completed reading 1276 items into address-list 
I'm currently working on adapting this,
:global readfile do={
    :local url        $1
	:global currentIP
    :local thefile    ""
    :local filesize   ([/tool fetch  src-address=$currentIP url=$url as-value output=none]->"downloaded")
    :local maxsize    64512 ; # is the maximum supported readable size of a block from a file
    :local start      0
    :local end        ($maxsize - 1)
    :local partnumber ($filesize / ($maxsize / 1024))
    :local remainder  ($filesize % ($maxsize / 1024))
    :if ($remainder > 0) do={ :set partnumber ($partnumber + 1) }
    :for x from=1 to=$partnumber step=1 do={
         :set thefile ($thefile . ([/tool fetch src-address=$currentIP url=$url http-header-field="Range: bytes=$start-$end" as-value output=user]->"data"))
         :set start   ($start + $maxsize)
         :set end     ($end   + $maxsize)
    }
    :return $thefile
}
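The chunk arithmetic in `$readfile` above is easy to get wrong, so here is a rough pure-Python model of it (hypothetical, only to show that back-to-back `Range:` windows reassemble the file exactly, with no trimming needed at chunk borders — unlike the earlier importer, which overlapped chunks by 512 bytes and then discarded partial lines):

```python
def read_in_chunks(data: bytes, maxsize: int = 64512) -> bytes:
    """Simulate fetching `data` through successive `Range: bytes=start-end`
    requests of `maxsize` bytes each and concatenating the results."""
    out = b""
    start, end = 0, maxsize - 1
    while start < len(data):
        out += data[start:end + 1]  # the HTTP Range header is inclusive
        start += maxsize
        end += maxsize
    return out

blob = b"\n".join(b"192.0.2.%d" % i for i in range(1000))
assert read_in_chunks(blob, maxsize=100) == blob  # nothing lost at borders
```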

{
/ip firewall address-list
:local update do={
 :global readfile
 :put "Starting import of address-list: $listname"
  :put "Deleting all dynamic entries in address-list: $listname"
  :if ($heirule != null) do={:put "Using as extra filtering: $heirule"}
  :if ($heirule = null) do={:set $heirule "."}
  :local n 0; # counter
  
 # remove the current list completely
 :do { /ip firewall address-list remove [find where list=$listname dynamic]} on-error={};
### line replaced ###  :local data ([:tool fetch src-address=$currentIP url=$url output=user as-value]->"data")
   :local data [$readfile $url]
   :put "Imported file length $[:len $data] bytes"
     :while ([:len $data]!=0) do={ 
       :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
       :if ($line~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}" && $line~$heirule) do={
        :set $n ($n+1) 
        :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$description timeout=$timeout} on-error={};
       }; # if IP address && extra filter if present
      :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
     }; # while
 :put "Completed importing $listname added/replacing $n lines."
}; # do

$update url=("https://www.example.com/kevinds/IPv4.txt") delimiter=("\n") listname=Whitelist timeout=8d comment="Whitelist for x"
}
To include this feature from Variant 2 in the first post.
the script does NOT delete actual addresses, but prolongs their timeout.
The primary reason is that I also use it for whitelists, and deleting the list will break connections each time the list is deleted. The second is that knowing how long an IP has been continuously listed is interesting to me. Lastly, the blacklist firewall stays up all the time: no gap while the router processes the list, which is minor, but still a factor.


But I don't understand how the array part works (yet?).
 
UkRainUa
newbie
Posts: 39
Joined: Sun Mar 10, 2024 3:10 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 19, 2024 1:56 am


Alright, but the FireHOL Level 1 page starts out with,

"includes: bambenek_c2 dshield feodo fullbogons spamhaus_drop spamhaus_edrop sslbl ransomware_rw"

bambenek_c2 - Last updated 792 days ago; contains 1 IP.

dshield - Last updated 458 days ago; contains 5,120 IPs.

feodo - Last updated 1913 days ago.

FullBogons - I need to filter these carefully, as I do need to route these; there is NO WAY I will run an .rsc script to import them.

sslbl - Last updated 1909 days ago.

ransomware_rw - Last updated 1551 days ago.

"Firehol is all public available open source" - the software itself is. The lists they use to generate their IP lists, not all of them are.

Off-topic though, so I will leave this topic alone now.
So, the original script (post #1) works fine with Spamhaus (RouterOS 7.14):
$update url=https://www.spamhaus.org/drop/drop.txt description="Spamhaus DROP" delimiter=("\_")
$update url=https://www.spamhaus.org/drop/edrop.txt description="Spamhaus EDROP" delimiter=("\_")
ip firewall address-list
:local update do={
:do {
:local data ([:tool fetch url=$url output=user as-value]->"data")
:local array [find dynamic list=blacklist]
:foreach value in=$array do={:set array ($array,[get $value address])}
:while ([:len $data]!=0) do={
:if ([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}") do={
:local ip ([:pick $data 0 [:find $data $delimiter]].$cidr)
:do {add list=blacklist address=$ip comment=$description timeout=1d} on-error={
:do {set ($array->([:find $array $ip]-[:len $array]/2)) timeout=1d} on-error={}
}
}
:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
}
} on-error={:log warning "Address list <$description> update failed"}
}
$update url=https://www.spamhaus.org/drop/drop.txt description="Spamhaus DROP" delimiter=("\_")
$update url=https://www.spamhaus.org/drop/edrop.txt description="Spamhaus EDROP" delimiter=("\_")
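The `on-error` branch in the script above is what implements the "prolong instead of delete" behaviour: a duplicate `add` fails, and the failure handler resets the existing entry's timeout instead. A hypothetical Python model of those semantics (names invented, just to illustrate):

```python
import datetime

# Stand-in for the address list: address -> (listed_since, expires).
entries = {}

def add_or_refresh(address, timeout=datetime.timedelta(days=1), now=None):
    """Add a new entry, or prolong an existing one without recreating it,
    so the original listing time survives and there is no coverage gap."""
    if now is None:
        now = datetime.datetime.now()
    if address in entries:
        listed_since = entries[address][0]  # duplicate: keep original time
    else:
        listed_since = now
    entries[address] = (listed_since, now + timeout)
```

Each import run simply calls `add_or_refresh` for every address in the feed; entries that drop out of the feed expire on their own when the timeout lapses.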
And see also:
https://www.spamhaus.org/resource-hub/n ... ngle-list/
From April 10th, 2024, Spamhaus eDROP (Extended Don’t Route Or Peer) data will be consolidated into the DROP lists, meaning eDROP will no longer be published separately. Read on for a closer look at why these changes are being implemented and what this means for those affected.

PS And firehol_level1 works too (31.3 KB)
https://iplists.firehol.org/
"Does it have a consistent size through time?
We don't want surprises. Sudden increases or decreases are generally an indication of poor maintenance."
$update url=https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/firehol_level1.netset description="firehol_level1" delimiter=("\n")
Last edited by UkRainUa on Tue Mar 19, 2024 11:29 am, edited 4 times in total.
 
kevinds
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 19, 2024 3:42 am

From April 10th, 2024, Spamhaus eDROP (Extended Don’t Route Or Peer) data will be consolidated into the DROP lists, meaning eDROP will no longer be published separately. Read on for a closer look at why these changes are being implemented and what this means for those affected.
Yes, those address lists are small.. Remove the line for the one being removed, that is it, done.
 
User avatar
mhenriques
newbie
Posts: 49
Joined: Sat Mar 23, 2019 8:45 pm
Location: BRAZIL
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 19, 2024 5:37 pm

Hello

I've tested this list with the following script and it seems to load OK on RouterOS 7.13.5.
#
# Script to build firewall blacklist based on various Internet lists
#
:global readfile do={
    :local url        $1
    :local thefile    ""
    :local filesize   ([/tool fetch url=$url as-value output=none]->"downloaded"); # calculate the downloaded list size
    :local UrlAccess  ([/tool fetch url=$url as-value output=none]->"status"); # verify if access is OK
    :if ($UrlAccess = "finished" && $filesize = 0) do={:set filesize 1}; # treating special case where size = 0 but list exists
    :local maxsize    64512 ; # is the maximum supported readable size of a block from a file
    :local start      0
    :local end        ($maxsize - 1)
    :local partnumber ($filesize / ($maxsize / 1024))
    :local remainder   ($filesize % ($maxsize / 1024))
    :if ($remainder > 0) do={ :set partnumber ($partnumber + 1) }; # if remainder != 0 then add extra block for processing
    :for x from=1 to=$partnumber step=1 do={
         :set thefile ($thefile . ([/tool fetch url=$url http-header-field="Range: bytes=$start-$end" as-value output=user]->"data"))
         :set start   ($start + $maxsize)
         :set end     ($end   + $maxsize)
    }; # end for loop
    :return $thefile;
}; # end global function

{
/ip firewall address-list
#
# Function to refresh Internet blacklist entries
#
:local update do={
 :global readfile
 :if ($heirule != null) do={:put "Using as extra filtering: $heirule"}
 :if ($heirule = null) do={:set $heirule "."}
 :local n 0; # counter
#
# remove the current list entries completely
#
 :put "Deleting all dynamic entries in address-list: $description"
 :do { /ip firewall address-list remove [find where comment=$description dynamic]} on-error={};
 :put "Starting import of address-list: $description"
 :local data [$readfile $url]; #call global function to read Internet list
   :put "Imported file length $[:len $data] bytes"
     :while ([:len $data]!=0) do={ 
       :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
       :if ($line ~ "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}" && $line~$heirule) do={
        :set $n ($n+1)
        :if ($description="DShield") do={
          :do {add list=$listname address=([:pick $data 0 [:find $data $delimiter]] . "/24") comment=$description timeout=$timeout} on-error={};
        } else={
          :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$description timeout=$timeout} on-error={};
          }
       ##
       ## :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$description timeout=$timeout} on-error={};
       ##
       }; # if IP address && extra filter if present
      :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
     }; # end while loop
 :put "Completed importing $listname added/replacing $n lines."
}; # do - End of refresh function

# $update url=("https://" . "lists.blocklist.de/lists/all.txt") delimiter=("\n") listname=blacklist description=BlockDE timeout=3d
$update url=("https://" . "www.dshield.org/block.txt") delimiter=("\t") listname=blacklist description=DShield timeout=3d
# $update url=("https://" . "iplists.firehol.org/files/firehol_level2.netset") delimiter=("\n") listname=blacklist description=FireHOLL2 timeout=3d
# $update url=("https://" . "view.sentinel.turris.cz/greylist-data/greylist-latest.csv") delimiter="," listname=blacklist description=GreyList timeout=3d heirule=http|smtp
$update url=("https://" . "www.spamhaus.org/drop/drop.txt") delimiter=" ; " listname=blacklist description=SpamHaus timeout=3d
$update url=("https://" . "sslbl.abuse.ch/blacklist/sslipblacklist.txt") delimiter=("\r") listname=blacklist description=SSLBL timeout=3d
$update url=("https://" . "uptimerobot.com/inc/files/ips/IPv4.txt") delimiter=("\r") listname=blacklist description=Robot timeout=3d

}; # End of main code
Ok... I need help, my address lists are not being imported correctly, they are missing entries..

1085 out of 1100 on my latest creation, and 75 out of 93 on another..

RouterOS v7.12.2

{
/ip firewall address-list
:local update do={
 :put "Starting import of address-list: $listname"
 :if ($nolog = null) do={:log warning "Starting import of address-list: $listname"}

 :local maxretry 3
 :local retrywaitingtime 120s
 :local retryflag true
 :for retry from=1 to=$maxretry step=1 do={
  :if ($retryflag) do={ :set $retryflag false; :set $counter 0
  :if ($retry > 1) do={
   :put "Source file changed. Retrying after a $retrywaitingtime wait..."
   :if ($nolog = null) do={:log warning "Source file changed. Retrying after a $retrywaitingtime wait..."}
   :delay $retrywaitingtime  }
  
 :local filesize ([/tool fetch url=$url src-address=$IP user=$user password=$password keep-result=no as-value]->"total")
 :local start 0
 :local maxsize 64000;	        # requested chunk size
 :local end ($maxsize - 1);	# because start is zero the maxsize has to be reduced by one
 :local partnumber	 ($filesize / ($maxsize / 1024)); # how many chunks of maxsize there are
 :local remainder	 ($filesize % ($maxsize / 1024)); # the last partial chunk
 :if ($remainder > 0)    do={ :set $partnumber ($partnumber + 1) }; # total number of chunks
 :if ($heirule != null) do={:put "Using as extra filtering: $heirule"} else={:set $heirule "."}
 # remove the current list completely if "erase" is not present (default setting)
  :if ($noerase = null) do={  
   :if ($timeout = null) do={:set $timeout 00:00:00; :do {:foreach i in=[/ip firewall address-list find list=$listname] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} } else={
   :do {:foreach i in=[/ip firewall address-list find list=$listname dynamic] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} };                
   :put ("Conditional deleting all".$dynamic." entries in address-list: $listname")
   :if ($nolog = null) do={:log warning ("Conditional deleting all".$dynamic." entries in address-list: $listname")}
  } else={:put "Entries not conditional deleted in address-list: $listname"}; # ENDIF ERASE
 :for x from=1 to=$partnumber step=1 do={
   # get the filesize to be compared to the original one; if it changed, then retry
   :local comparesize ([/tool fetch url=$url src-address=$IP user=$user password=$password keep-result=no as-value]->"total")
   
#:set $comparesize 5 

   # fetching the chunks from the webserver when the size of the source file has not changed
   # empty array when the source file changed. No processing is done till the next complete retry
   :if ($comparesize = $filesize) do={:set $data ([:tool fetch url=$url src-address=$IP user=$user password=$password http-header-field="Range: bytes=$start-$end" output=user as-value]->"data")} else={:set $data [:toarray ""]; :set $retryflag true}
     #:if ($ownposix = null) do={
  # determining the used delimiter in the list if not provided in the config
   # this only run once and so the impact on the import time is low
    :local ipv4Posix	  "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}"
    :local ipv4rangePosix "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}/[0-9]{1,2}"
    :local domainPosix	  "^.+\\.[a-z.]{2,7}"
    :local sdata $data;     
    :while ([:len $sdata]!=0 && $delimiter = null) do={ # The check on length of $sdata is for if no delimiter is found.
       	:local sline [:pick $sdata 0 [:find $sdata "\n"]]; :local slen [:len $sline];
       	# set posix depending of type of data used in the list
       	:if ($sline ~ $ipv4Posix)	do={:set $posix $ipv4Posix;	 :set $iden "List identified as a IPv4 list"}
       	:if ($sline ~ $ipv4rangePosix)	do={:set $posix $ipv4rangePosix; :set $iden "List identified as a IPv4 with ranges list"}
       	:if ($sline ~ $domainPosix)	do={:set $posix $domainPosix;	 :set $iden "List identified as a domain list"}
       	:if ($sline ~ $posix) do={:put $iden}
      	:if ($sline ~ $posix) do={ # only explore the line if there is match at the start of the line.
	 :do {:if ([:pick $sline 0 ($slen-$send)] ~ ($posix."\$")|| $send > $slen) do={:set $delimiter [:pick $sline ($slen-$send) ($slen-($send-1))]; :set $result true} else={:set $send ($send+1);} } while (!$result);
	}; #IF posix
	:set $sdata [:pick $sdata ([:find $sdata "\n"]+1) [:len $sdata]];
	:if ($delimiter != null) do={:local sdata [:toarray ""]}; # clear the sdata array; it is not needed anymore, and this triggers the While to end
    }; #WHILE END $sdata
    :local sdata [:toarray ""] 
   #} else={:put "User defind Posix: $ownposix"; :set $posix $ownposix } ; # ENDIF ownposix = null   
   :if ($posix = null && $delimiter != null) do={:set $posix "."; :put "Using config-line defined delimiter: \"$delimiter\""}; # delimiter provided by config line
   :if (!$retryflag) do={:put "Reading Part: $x $start - $end"}   
   :if ($timeout = null) do={:local timeout 00:00:00}; # if no timeout is defined make it a static entry.    
   # Only remove the first line if you are not at the start of the list
   
   :if ($start > 0) do={:set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]}
     :while ([:len $data]!=0) do={
       :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
       :if ( $line ~ $posix && $line~$heirule) do={    
        :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$comment timeout=$timeout; :set $counter ($counter + 1)} on-error={}; # on error avoids any panics        
       }; # if IP address && extra filter if present
      :set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
      # Cut off the end of the chunks by removing the last lines... very dirty, but it works
      :if ([:len $data] < 256) do={:set $data [:toarray ""]}    
     }; # while

  :set $start (($start-512) + $maxsize); # shifts the subsequent start back by 512  
  :set $end (($end-512) + $maxsize); # shifts the subsequent end back by 512 to keep the chunks overlapping
  }; # if retryflag
 }; #do for x
 
}; # for retry
 :if ($counter < 1) do={:set $resultline "Import was NOT successful! Check if the list $listname is still being maintained."} else={:set $resultline "Completed reading $counter items into address-list $listname." } 
 :put $resultline
 :if ($nolog = null) do={:log warning $resultline }
 :if ($counter > 0) do={:do {/ip firewall address-list remove [find where list=("backup".$listname)]} on-error={} } else={
 :do {:foreach i in=[/ip firewall address-list find list=("backup".$listname)] do={/ip firewall address-list set list=$listname $i }} on-error={}
 :put "Restoring backup list: $listname" 
 :if ($nolog = null) do={:log warning "Restoring backup list: $listname"}
 }; # if counter restore on failure and remove on success
}; # do
:global currentIP
$update url=https://www.example.com/UptimeRobot/IPv4.txt IP=$currentIP listname=UptimeRobot user="anonymous" password="anonymous" timeout=5d noerase=1 comment="UptimeRobot" delimiter=("\n")
}

# Config-line settings to be used:
# url=	        https://name.of.the.list
# listname=	name of address-list

# Optional settings
# timeout=	the time the entry should be active. If omitted, static entries are created.
# comment=	puts this comment on every line in the chosen address-list (default: no comment)
# heirule=	this will select on a word on each line whether to import or not (default: no heirule)
# noerase=	any value, then the current list is not erased (default: erase)
# ownPosix=	allows entering your own POSIX regex to be used (not active at this moment)
# nolog=        any value, then don't write to the log (default: writing to log)
# delimiter=     manually set the delimiter
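As an aside, the chunk arithmetic the script uses can be sketched in Python (illustrative only; `plan_chunks` is a hypothetical helper, and it assumes fetch's "total" is reported in KiB, which is why the script divides by `$maxsize / 1024`):

```python
# Illustrative sketch (not RouterOS) of the chunk planning used above.
# Assumption: the file size is given in KiB, as fetch's "total" reports it.
def plan_chunks(filesize_kib: int, maxsize: int = 64000, overlap: int = 512):
    """Return the list of (start, end) byte ranges the script would request."""
    kib_per_chunk = maxsize // 1024          # integer division, as in the script
    parts = filesize_kib // kib_per_chunk
    if filesize_kib % kib_per_chunk > 0:
        parts += 1                           # one extra request for the partial last chunk
    ranges = []
    start, end = 0, maxsize - 1
    for _ in range(parts):
        ranges.append((start, end))
        # shift the next window back by `overlap` bytes so no line is lost at a boundary
        start = (start - overlap) + maxsize
        end = (end - overlap) + maxsize
    return ranges
```

Note how consecutive windows overlap by 512 bytes, matching the `($start-512) + $maxsize` shift in the script; the import loop then discards the duplicated first line of each later chunk.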

So the list here,

https://uptimerobot.com/inc/files/ips/IPv4.txt (I'll get the v6 version when I get my v6 routing working correctly)

I run it through dos2unix first to remove the carriage return at the end of each line, then it goes on a normal webserver so my various RouterOS systems can grab it. I tried a few delimiter variations, but in the end it was simpler to just 'fix' the file.
Completed reading 75 items into address-list UptimeRobot.
There are 93 entries in that list.

My 1085 out of 1100 comes from the Spamhaus ASN-DROP list. I downloaded all of the IP ranges for all the ASNs listed in the asndrop.json list, ran them through aggregate, and ended up with 1100 (2326 before aggregation). Only 1085 import, though.

The other lists I use, I haven't really paid attention to whether they are missing entries or not. I noticed the UptimeRobot list was missing entries because some were missing when we checked the firewall. I looked at the file in a hex viewer and every line is exactly the same; I don't know why it isn't importing properly. It isn't the 64 KiB issue, as both of these are well below that.

What am I missing? Is the importer script I'm using flawed? I know there have been many versions and variations of this one..

I'm still on v7.12.2 because v7.13 broke this script, and the features I'm interested in from the later versions don't work on my models, so there's no good reason to update.
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 19, 2024 6:09 pm

I've tested this list with the following script and it seems to load OK on RouterOS 7.13.5.
Indeed, that script does import properly. I added an update on what I figured out and why it wasn't loading the last few addresses.

Once I figure out the array stuff to update a list without deleting it first, I'll be good again.

My CCR1036 is showing signs that it is dying, so I haven't worked on this since.
 
ivicask
Member
Member
Posts: 438
Joined: Tue Jul 07, 2015 2:40 pm
Location: Croatia, Zagreb

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 19, 2024 6:17 pm

I found this script on the forum. It works OK on my hEX S running 7.13.2.
The only change I've made was to concentrate all entries in a single "blacklist" and select the entries via the comment field.

MH

:global readfile do={
    :local url        $1
    :local thefile    ""
    :local filesize   ([/tool fetch url=$url as-value output=none]->"downloaded")
    :local maxsize    64512 ; # is the maximum supported readable size of a block from a file
    :local start      0
    :local end        ($maxsize - 1)
    :local partnumber ($filesize / ($maxsize / 1024))
    :local reminder   ($filesize % ($maxsize / 1024))
    :if ($reminder > 0) do={ :set partnumber ($partnumber + 1) }
    :for x from=1 to=$partnumber step=1 do={
         :set thefile ($thefile . ([/tool fetch url=$url http-header-field="Range: bytes=$start-$end" as-value output=user]->"data"))
         :set start   ($start + $maxsize)
         :set end     ($end   + $maxsize)
    }
    :return $thefile
}
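For comparison, the same sliding-window reassembly can be sketched in Python; `fetch_range` is a hypothetical stand-in for `/tool fetch` with `http-header-field="Range: bytes=$start-$end"`, injected as a callback so the logic can be tested without a webserver:

```python
def read_file(filesize_kib, fetch_range, maxsize=64512):
    """Reassemble a remote file by requesting consecutive byte ranges.

    fetch_range(start, end) stands in for /tool fetch with a Range header;
    filesize_kib mirrors fetch's size report in KiB.
    """
    kib_per_chunk = maxsize // 1024                  # 63 KiB per request
    parts = filesize_kib // kib_per_chunk
    if filesize_kib % kib_per_chunk:
        parts += 1                                   # partial last chunk
    out = b""
    start, end = 0, maxsize - 1
    for _ in range(parts):
        out += fetch_range(start, end)               # append this chunk's bytes
        start += maxsize                             # no overlap in this variant
        end += maxsize
    return out
```

Unlike the chunked variant earlier in the thread, `readfile` uses no 512-byte overlap; it simply concatenates back-to-back ranges into one string before parsing.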

{
/ip firewall address-list
:local update do={
 :global readfile
 :put "Starting import of address-list: $listname"
  :put "Deleting all dynamic entries in address-list: $listname"
  :if ($heirule != null) do={:put "Using as extra filtering: $heirule"}
  :if ($heirule = null) do={:set $heirule "."}
  :local n 0; # counter
  
 # remove the current list completely
 :do { /ip firewall address-list remove [find where comment=$description dynamic]} on-error={};
### line replaced ###  :local data ([:tool fetch url=$url output=user as-value]->"data")
   :local data [$readfile $url]
   :put "Imported file length $[:len $data] bytes"
     :while ([:len $data]!=0) do={ 
       :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
       :if ($line~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}" && $line~$heirule) do={
        :set $n ($n+1) 
        :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$description timeout=$timeout} on-error={};
       }; # if IP address && extra filter if present
      :set data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
     }; # while
 :put "Completed importing $listname added/replacing $n lines."
}; # do

$update url=("https://" . "lists.blocklist.de/lists/all.txt") delimiter=("\n") listname=blacklist description=BlockDE timeout=1d
$update url=("https://" . "www.dshield.org/block.txt") delimiter=("\t") listname=blacklist description=DShield timeout=1d
$update url=("https://" . "iplists.firehol.org/files/firehol_level2.netset") delimiter=("\n") listname=blacklist description=FireHOLL2 timeout=1d
$update url=("https://" . "view.sentinel.turris.cz/greylist-data/greylist-latest.csv") delimiter="," listname=blacklist description=GreyList timeout=1d heirule=http|smtp
$update url=("https://" . "www.spamhaus.org/drop/drop.txt") delimiter=" ; " listname=blacklist description=SpamHaus timeout=1d
$update url=("https://" . "sslbl.abuse.ch/blacklist/sslipblacklist.txt") delimiter=("\r") listname=blacklist description=SSLBL timeout=1d

}
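For reference, the import loop in the script above (match each line against the IPv4 posix, apply the heirule filter, then cut the address at the delimiter) can be mimicked in Python; this is a sketch of the idea, not the script itself, and `parse_list` is an illustrative name:

```python
import re

# Same start-of-line IPv4 posix the script uses
IPV4 = re.compile(r"^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}")

def parse_list(data, delimiter, heirule="."):
    """One address per matching line, cut at the first delimiter."""
    addresses = []
    for line in data.split("\n"):
        # the line must start with an IPv4 address AND match the heirule filter
        if IPV4.match(line) and re.search(heirule, line):
            # [:pick $data 0 [:find $data $delimiter]] in the script
            addresses.append(line.split(delimiter)[0])
    return addresses
```

Comment lines (e.g. Spamhaus's ";" header) fall through the IPv4 match, which is why the script tolerates them without an explicit skip.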

This is awesome. I have it running now for a few days on my RB5009 @ v7.13.5 and it works flawlessly. It auto-updates the list every day.
I dropped this in over my old script, which runs on a scheduler ONCE A DAY, and noticed it runs multiple times a day instead of once as I set in the scheduler.

How does this script run itself multiple times a day? I don't see it in the code.
 
User avatar
mhenriques
newbie
Posts: 49
Joined: Sat Mar 23, 2019 8:45 pm
Location: BRAZIL
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 19, 2024 7:04 pm

The last few lines starting with "$update" call the update function, once for each URL defined. Thus, the script runs once for each "$update" call that is not commented out. No commands to mess with the scheduler. Note that the script adds the entries on the blacklist with a 3-day timeout. Hence, at the end of 3 days the list will be empty. You can modify it at your discretion.

I'm not the author of this script. I only fixed the pieces I've needed and learned a lot of scripting in the process.

MH
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 19, 2024 7:25 pm

I dropped this in over my old script, which runs on a scheduler ONCE A DAY, and noticed it runs multiple times a day instead of once as I set in the scheduler.

How does this script run itself multiple times a day? I don't see it in the code.
Please run this command and post the output.

/system/scheduler export terse

Might need "/system/scripts/export terse" too, but not sure yet.
 
ivicask
Member
Member
Posts: 438
Joined: Tue Jul 07, 2015 2:40 pm
Location: Croatia, Zagreb

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 19, 2024 7:30 pm

The last few lines starting with "$update" call the update function, once for each URL defined. Thus, the script runs once for each "$update" call that is not commented out. No commands to mess with the scheduler. Note that the script adds the entries on the blacklist with a 3-day timeout. Hence, at the end of 3 days the list will be empty. You can modify it at your discretion.

I'm not the author of this script. I only fixed the pieces I've needed and learned a lot of scripting in the process.

MH
Not sure what to modify in script to disable this behaviour?

To be clear, I only have this in the scheduler (run once a day, at 6 in the morning), yet it runs multiple times a day at random intervals, which I don't want. I'm totally confused.
/system scheduler
add interval=1d name=BadIPList on-event="/system script run BADiplist" policy=\
    ftp,reboot,read,write,policy,test,password,sniff,sensitive,romon \
    start-date=2016-02-22 start-time=06:00:00
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 19, 2024 7:45 pm

Not sure what to modify in script to disable this behaviour?
It isn't the script..

Your scheduler only has the one entry? Not that it really matters, there are many places that can start a script..

If you are seeing it in the logs multiple times, is there a pattern? What times do they show?
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 19, 2024 7:47 pm

Not sure what to modify in script to disable this behaviour?
It isn't the script..

Your scheduler only has the one entry? Not that it really matters, there are many places that can start a script..

Please post the rest of your logs. There will be something triggering it; without the rest of your logs, my first guess would be your DHCP client: when it updates the lease, it runs the script.

Do a full export..

/export terse file=whatisgoingon.rsc

Even if you don't post it here, open it in a text editor and use the Find function for BadIPList. You may see what else is starting it.
 
ivicask
Member
Member
Posts: 438
Joined: Tue Jul 07, 2015 2:40 pm
Location: Croatia, Zagreb

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 19, 2024 8:00 pm

I don't have DHCP clients or any DHCP scripts.
I already did a full export and searched for this line; it's nowhere else.
BTW, I have the same issue on 5 different routers (different models and different RouterOS versions).
I have had this scheduler+script for many years; I only copy-pasted the new updated script code into my (BADiplist) script, and that's all I did, and the problem started.

I'm going to try deleting all of them and re-creating them; it must be some kind of bug.
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue Mar 19, 2024 8:51 pm

I don't have DHCP clients or any DHCP scripts.
Configured static IP from your ISP then?
 
ivicask
Member
Member
Posts: 438
Joined: Tue Jul 07, 2015 2:40 pm
Location: Croatia, Zagreb

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Wed Mar 20, 2024 11:14 am

I have completely removed all the scripts, renamed them, and re-added them.
I rebooted the router and even tried updating to the latest beta 7.15; it still runs on its own at times I didn't specify:
Last run Mar/20/2024 10:00:00. What the hell started it? The scheduler clearly shows next run Mar/21/2024 06:00:00, and it only ran once on router startup from the scheduler, as it should, which was 2 hours before this 10:00:00 log print.
If I replace this code with simple code like :log info "Script ran", then I don't see it printed in the log unless it is started by the scheduler at the specified times, as it should be.
/system script
add dont-require-permissions=yes name=01BadIpDownloader owner=admin policy=ftp,reboot,read,write,policy,test,password,sniff,sensitive,romon source=":gl\
    obal readfile do={\r\
    \n    :local url        \$1\r\
    \n    :local thefile    \"\"\r\
    \n    :local filesize   ([/tool fetch url=\$url as-value output=none]->\"downloaded\")\r\
    \n    :local maxsize    64512 ; # is the maximum supported readable size of a block from a file\r\
    \n    :local start      0\r\
    \n    :local end        (\$maxsize - 1)\r\
    \n    :local partnumber (\$filesize / (\$maxsize / 1024))\r\
    \n    :local reminder   (\$filesize % (\$maxsize / 1024))\r\
    \n    :if (\$reminder > 0) do={ :set partnumber (\$partnumber + 1) }\r\
    \n    :for x from=1 to=\$partnumber step=1 do={\r\
    \n         :set thefile (\$thefile . ([/tool fetch url=\$url http-header-field=\"Range: bytes=\$start-\$end\" as-value output=user]->\"data\"))\r\
    \n         :set start   (\$start + \$maxsize)\r\
    \n         :set end     (\$end   + \$maxsize)\r\
    \n    }\r\
    \n    :return \$thefile\r\
    \n}\r\
    \n\r\
    \n{\r\
    \n/ip firewall address-list\r\
    \n:local update do={\r\
    \n :global readfile\r\
    \n :put \"Starting import of address-list: \$listname\"\r\
    \n  :put \"Deleting all Dynamic enties in address-list: \$listname\"\r\
    \n  :if (heirule != null) do={:put \"Using as extra filtering: \$heirule\"}\r\
    \n  :if (\$heirule = null) do={:set \$heirule \".\"}\r\
    \n  :local n 0; # counter\r\
    \n  \r\
    \n # remove the current list completely\r\
    \n :do { /ip firewall address-list remove [find where comment=\$description dynamic]} on-error={};\r\
    \n### line replaced ###  :local data ([:tool fetch url=\$url output=user as-value]->\"data\")\r\
    \n   :local data [\$readfile \$url]\r\
    \n   :put \"Imported file length \$[:len \$data] bytes\"\r\
    \n     :while ([:len \$data]!=0) do={ \r\
    \n       :local line [:pick \$data 0 [:find \$data \"\\n\"]]; # create only once and checked twice as local variable\r\
    \n       :if (\$line~\"^[0-9]{1,3}\\\\.[0-9]{1,3}\\\\.[0-9]{1,3}\\\\.[0-9]{1,3}\" && \$line~heirule) do={\r\
    \n        :set \$n (\$n+1) \r\
    \n        :do {add list=\$listname address=[:pick \$data 0 [:find \$data \$delimiter]] comment=\$description timeout=\$timeout} on-error={};\r\
    \n       }; # if IP address && extra filter if present\r\
    \n      :set data [:pick \$data ([:find \$data \"\\n\"]+1) [:len \$data]]; # removes the just added IP from the data array\r\
    \n     }; # while\r\
    \n :put \"Completed importing \$listname added/replacing \$n lines.\"\r\
    \n}; # do\r\
    \n\r\
    \n\$update url=(\"https://\" . \"lists.blocklist.de/lists/all.txt\") delimiter=(\"\\n\") listname=blacklist description=BlockDE timeout=1d\r\
    \n\$update url=(\"https://\" . \"www.dshield.org/block.txt\") delimiter=(\"\\t\") listname=blacklist description=DShield timeout=1d\r\
    \n\$update url=(\"https://\" . \"iplists.firehol.org/files/firehol_level2.netset\") delimiter=(\"\\n\") listname=blacklist description=FireHOLL2 ti\
    meout=1d\r\
    \n\$update url=(\"https://\" . \"view.sentinel.turris.cz/greylist-data/greylist-latest.csv\") delimiter=\",\" listname=blacklist description=GreyLi\
    st timeout=1d heirule=http|smtp\r\
    \n\$update url=(\"https://\" . \"www.spamhaus.org/drop/drop.txt\") delimiter=\" ; \" listname=blacklist description=SpamHaus timeout=1d\r\
    \n\$update url=(\"https://\" . \"sslbl.abuse.ch/blacklist/sslipblacklist.txt\") delimiter=(\"\\r\") listname=blacklist description=SSLBL timeout=1d\
    \r\
    \n\r\
    \n}"
/system scheduler
add interval=1d name=01BadIpDownloader on-event="/system script run \"01BadIpDownloader\"\r\
    \n" policy=ftp,reboot,read,write,policy,test,password,sniff,sensitive start-date=2023-04-28 start-time=06:00:00
add name=02BadIpDownloader on-event="delay 20\r\
    \n/system script run \"01BadIpDownloader\"\r\
    \n" policy=ftp,reboot,read,write,policy,test,password,sniff,sensitive start-time=startup
	
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Wed Mar 20, 2024 1:57 pm

I rebooted the router and even tried updating to the latest beta 7.15; it still runs on its own at times I didn't specify:
Your screenshots now show it running only once, though, at 10:00. Are there other log entries that show it running at 8:00?

Is the time set correctly?

Please post your full config.
 
ivicask
Member
Member
Posts: 438
Joined: Tue Jul 07, 2015 2:40 pm
Location: Croatia, Zagreb

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Wed Mar 20, 2024 2:37 pm

I rebooted the router and even tried updating to the latest beta 7.15; it still runs on its own at times I didn't specify:
Your screenshots now show it running only once, though, at 10:00. Are there other log entries that show it running at 8:00?

Is the time set correctly?

Please post your full config.
Yes, I rebooted the router in the morning and it ran at 07:51 per the startup run set in the scheduler.

The time is correct; still, even if it weren't, why would it run every 3-4 hours? It makes no sense...

BTW, it ran again just now, Mar/20/2024 13:01:15, and the last-run time updated, but the run count DID NOT increase! How can it run but the count not increase?
 
UkRainUa
newbie
Posts: 39
Joined: Sun Mar 10, 2024 3:10 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 28, 2024 3:13 am

Looks like it's the :for loop in :global readfile and something with fetch...
You can try
 /system logging add disabled=no topics=fetch
for more messages :)
 
MTNick
Member Candidate
Member Candidate
Posts: 106
Joined: Fri Nov 24, 2023 6:43 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Mar 28, 2024 11:56 pm

Due to the problem with "\n" having to be set manually, I have adapted the script to set it for you when no delimiter has been found:
{
/ip firewall address-list
:local update do={
 :put "Starting import of address-list: $listname"
 :if ($nolog = null) do={:log warning "Starting import of address-list: $listname"}
 
 :local displayed true
 :local maxretry 3
 :local retrywaitingtime 120s
 :local retryflag true
 :for retry from=1 to=$maxretry step=1 do={
  :if ($retryflag) do={ :set $retryflag false; :set $counter 0
  :if ($retry > 1) do={
   :put "Source file changed. Retrying after a $retrywaitingtime wait..."
   :if ($nolog = null) do={:log warning "Source file changed. Retrying after a $retrywaitingtime wait..."}
   :delay $retrywaitingtime  }
  
  :local fetchResult [/tool fetch url=$url keep-result=no as-value]
  :local filesize ($fetchResult->"total")
  :local downsize ($fetchResult->"downloaded") 
  :if ($filesize = 0 && $downsize > 0) do={ :set $filesize $downsize}

  :local start 0
  :local maxsize 64000;	        # requested chunk size
  :local end ($maxsize - 1);	# because start is zero, maxsize has to be reduced by one
  :local partnumber	 ($filesize / ($maxsize / 1024)); # how many full chunks of maxsize fit
  :local remainder	 ($filesize % ($maxsize / 1024)); # the last partial chunk 
  :if ($remainder > 0)    do={ :set $partnumber ($partnumber + 1) }; # total number of chunks
  :if ($heirule != null) do={:put "Using as extra filtering: $heirule"} else={:set $heirule "."}
 # remove the current list completely if "erase" is not present (default setting)
  :if ($noerase = null) do={  
   :if ($timeout = null) do={:set $timeout 00:00:00; :do {:foreach i in=[/ip firewall address-list find list=$listname] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} } else={
   :do {:foreach i in=[/ip firewall address-list find list=$listname dynamic] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} };                
   :put ("Conditional deleting all".$dynamic." entries in address-list: $listname")
   :if ($nolog = null) do={:log warning ("Conditional deleting all".$dynamic." entries in address-list: $listname")}
  } else={:put "Entries not conditional deleted in address-list: $listname"}; # ENDIF ERASE
 :for x from=1 to=$partnumber step=1 do={
   # get the filesize to be compared to the original one; if it changed, then retry
   :local comparesize ([/tool fetch url=$url keep-result=no as-value]->"total")
   :if ($comparesize = 0 && $downsize > 0) do={ :set $comparesize $downsize}
   
   # fetching the chunks from the webserver when the size of the source file has not changed
   # empty array when the source file changed. No processing is done till the next complete retry
   :if ($comparesize = $filesize) do={:set $data ([:tool fetch url=$url http-header-field="Range: bytes=$start-$end" output=user as-value]->"data")} else={:set $data [:toarray ""]; :set $retryflag true}
     #:if ($ownposix = null) do={
  # determining the used delimiter in the list, when not provided in the config
   # this only run once and so the impact on the import time is low
    :local ipv4Posix	  "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}"
    :local ipv4rangePosix "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}/[0-9]{1,2}"
    :local domainPosix	  "^.+\\.[a-z.]{2,7}"
    :local sdata $data;
   # removes any lines at the top of the file that could interfere with finding the correct posix. Setting remarksign is needed
    :while ([:pick $sdata 0 1] = $remarksign) do={ :set $sdata [:pick $sdata ([:find $sdata "\n"]+1) [:len $sdata]] }    
    :while ([:len $sdata]!=0 && $delimiter = null) do={ # The check on length of $sdata is for if no delimiter is found.   
       	:local sline [:pick $sdata 0 [:find $sdata "\n"]]; :local slen [:len $sline];
       	# set posix depending of type of data used in the list
       	:if ($sline ~ $ipv4Posix)	    do={:set $posix $ipv4Posix;	     :set $iden "List identified as a IPv4 list"}
       	:if ($sline ~ $ipv4rangePosix)	do={:set $posix $ipv4rangePosix; :set $iden "List identified as a IPv4 with ranges list"}
       	:if ($sline ~ $domainPosix)	    do={:set $posix $domainPosix;	 :set $iden "List identified as a domain list"}
       	:if ($sline ~ $posix) do={:put $iden}
      	:if ($sline ~ $posix) do={ # only explore the line if there is a match at the start of the line.
	      :do {:if ([:pick $sline 0 ($slen-$send)] ~ ($posix."\$") || $send > $slen) do={
	        :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-1))]; :set $result true} else={:set $send ($send+1)}  
             :if ($result) do={ :set  $extra [:pick $sline ($slen-$send) ($slen-($send-1))]
              :if ( $extra = " " )   do={ :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-2))] }
              :if ( $extra = "  " )  do={ :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-3))] }
              :if ( $extra = "   " ) do={ :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-4))] }
             }; # EndIf result
	      } while (!$result); # EndDoWhile
	    }; #IF sline posix
	:set $sdata [:pick $sdata ([:find $sdata "\n"]+1) [:len $sdata]]; # cut off the already searched lines
	:if ($delimiter != null) do={:local sdata [:toarray ""]} ; #Clearing sdata array ending the WhileDo loop
    }; #WHILE END $sdata
    :local sdata [:toarray ""]
   :if ([:len $delimiter] = 0) do={ :set $delimiter "\n"; :set $delimiterShow "New Line" } else={ :set $delimiterShow $delimiter }; # when empty use NewLine 20220529	
   #} else={:put "User defind Posix: $ownposix"; :set $posix $ownposix } ; # ENDIF ownposix = null
   :if ($delimiter != null && $displayed ) do={:set $displayed false; :put "Using config provided delimiter: \"$delimiterShow\""}
   :if ($posix = null) do={:set $posix "."}; # Use a match all posix if nothing is defined or found 
   :if (!$retryflag) do={:put "Reading Part: $x $start - $end"}   
   :if ($timeout = null) do={:local timeout 00:00:00}; # if no timeout is defined make it a static entry.    
   # Only remove the first line if you are not at the start of the list
   
:while ( [:pick $data 0 1] = $remarksign) do={ :set $data [:pick $data ([:find $data "\n"]+1) [:len $data]] }; # removes the invalid line (Spamhaus) 
   
   :if ($start > 0) do={:set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]}
     :while ([:len $data]!=0) do={
       :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
       :if ( $line ~ $posix && $line~$heirule) do={    
        :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$comment timeout=$timeout; :set $counter ($counter + 1)} on-error={}; # on error avoids any panics        
       }; # if IP address && extra filter if present
      :set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
      # Cut off the end of the chunks by removing the last lines... very dirty, but it works
      :if ([:len $data] < 256) do={:set $data [:toarray ""]}    
     }; # while

  :set $start (($start-512) + $maxsize); # shifts the subsequent start back by 512  
  :set $end (($end-512) + $maxsize); # shifts the subsequent end back by 512 to keep the chunks overlapping
  }; # if retryflag
 }; #do for x
 
}; # for retry
 :if ($counter < 1) do={:set $resultline "Import was NOT successful! Check if the list $listname is still being maintained."} else={:set $resultline "Completed reading $counter items into address-list $listname." } 
 :put $resultline
 :if ($nolog = null) do={:log warning $resultline }
 :if ($counter > 0) do={:do {/ip firewall address-list remove [find where list=("backup".$listname)]} on-error={} } else={
 :do {:foreach i in=[/ip firewall address-list find list=("backup".$listname)] do={/ip firewall address-list set list=$listname $i }} on-error={}
 :put "Restoring backup list: $listname" 
 :if ($nolog = null) do={:log warning "Restoring backup list: $listname"}
 }; # if counter restore on failure and remove on success
}; # do
$update url=https://www.spamhaus.org/drop/drop.txt listname=spamhaus remarksign=";" timeout=1d nolog=1
$update url=https://lists.blocklist.de/lists/all.txt listname=blockDE timeout=1d nolog=1
}

# To be used configline settings:
# url=	        https://name.of.the.list
# listname=	name of address-list

# Optional settings
# timeout=	the time the entry should be active. If omitted, static entries are created.
# comment=	puts this comment on every line in the chosen address-list (default: no comment)
# heirule=	import a line only if it matches this word (default: no heirule)
# noerase=	any value, then the current list is not erased (default: erase)
# ownPosix=	allows entering your own POSIX regex to be used (not active at this moment)
# nolog=        any value, then don't write to the log (default: writing to log)
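For readers puzzled by the 512-byte back-shift above: the script steps each Range request forward by (maxsize - 512) so that consecutive chunks overlap, and the first (possibly partial) line of every later chunk is dropped. A minimal Python sketch of just that range arithmetic (illustration only, not part of the RouterOS script; the helper name is hypothetical):

```python
MAXSIZE = 64000  # requested chunk size, as in the script
OVERLAP = 512    # bytes shared between consecutive chunks

def chunk_ranges(filesize, maxsize=MAXSIZE, overlap=OVERLAP):
    """Byte ranges for the HTTP 'Range: bytes=start-end' requests.
    Each chunk re-reads the last `overlap` bytes of the previous one,
    so a line cut in half at a chunk boundary is still read whole."""
    ranges = []
    start, end = 0, maxsize - 1
    while start < filesize:
        ranges.append((start, min(end, filesize - 1)))
        # mirror the script: shift both bounds forward by (maxsize - overlap)
        start = (start - overlap) + maxsize
        end = (end - overlap) + maxsize
    return ranges
```

For a 150000-byte file this yields (0, 63999), (63488, 127487), (126976, 149999): each later chunk starts 512 bytes before the previous one ended.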
I removed the previous versions of this script to avoid any confusion.

Updated the textual part of the script so it states when "\n" NewLine is enforced.

Greetings msatter. Is there a way to get Spamhaus to work in this script? I've tried several delimiter settings, but it won't download. This version works well with every other link; for some reason it won't work with Spamhaus. The logging in this script is the best I've seen. I've been using two different scripts to download the lists, but the other script doesn't log like this one does. Any help would be appreciated!

RB5009 with ROS 7.14.1
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Mar 29, 2024 4:25 pm

I've tried several delimiter settings but it won't download.
This one works for me on the Spamhaus drop list.
 delimiter=("\_")
 
MTNick
Member Candidate
Member Candidate
Posts: 106
Joined: Fri Nov 24, 2023 6:43 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Mar 30, 2024 1:54 am

Greetings kevinds. Unfortunately I've tried that. It starts the download, then the script just stops. I'm testing it on its own, as the only list in the script, and it fails. Every other link works. Not sure what's up with it. No reason in the log, the script just stops.
{
/ip firewall address-list
:local update do={
 :put "Starting import of address-list: $listname"
 :if ($nolog = null) do={:log warning "Starting import of address-list: $listname"}
 
 :local displayed true
 :local maxretry 3
 :local retrywaitingtime 120s
 :local retryflag true
 :for retry from=1 to=$maxretry step=1 do={
  :if ($retryflag) do={ :set $retryflag false; :set $counter 0
  :if ($retry > 1) do={
   :put "Source file changed. Retrying after a $retrywaitingtime wait..."
   :if ($nolog = null) do={:log warning "Source file changed. Retrying after a $retrywaitingtime wait..."}
   :delay $retrywaitingtime  }
  
  :local fetchResult [/tool fetch url=$url keep-result=no as-value]
  :local filesize ($fetchResult->"total")
  :local downsize ($fetchResult->"downloaded") 
  :if ($filesize = 0 && $downsize > 0) do={ :set $filesize $downsize}

  :local start 0
  :local maxsize 64000;	        # requested chunk size
  :local end ($maxsize - 1);	# because start is zero the maxsize has to be reduced by one
  :local partnumber	 ($filesize / ($maxsize / 1024)); # how many chunks of maxsize
  :local remainder	 ($filesize % ($maxsize / 1024)); # the last partial chunk 
  :if ($remainder > 0)    do={ :set $partnumber ($partnumber + 1) }; # total number of chunks
  :if ($heirule != null) do={:put "Using as extra filtering: $heirule"} else={:set $heirule "."}
 # remove the current list completely if "erase" is not present (default setting)
  :if ($noerase = null) do={  
   :if ($timeout = null) do={:set $timeout 00:00:00; :do {:foreach i in=[/ip firewall address-list find list=$listname] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} } else={
   :do {:foreach i in=[/ip firewall address-list find list=$listname dynamic] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} };                
   :put ("Conditional deleting all".$dynamic." entries in address-list: $listname")
   :if ($nolog = null) do={:log warning ("Conditional deleting all".$dynamic." entries in address-list: $listname")}
  } else={:put "Entries not conditional deleted in address-list: $listname"}; # ENDIF ERASE
 :for x from=1 to=$partnumber step=1 do={
   # get filesize to be compared to the original one; if it changed, then retry
   :local comparesize ([/tool fetch url=$url keep-result=no as-value]->"total")
   :if ($comparesize = 0 && $downsize > 0) do={ :set $comparesize $downsize}
   
   # fetching the chunks from the webserver when the size of the source file has not changed
   # empty array when the source file changed. No processing is done till the next complete retry
   :if ($comparesize = $filesize) do={:set $data ([:tool fetch url=$url http-header-field="Range: bytes=$start-$end" output=user as-value]->"data")} else={:set $data [:toarray ""]; :set $retryflag true}
     #:if ($ownposix = null) do={
  # determining the used delimiter in the list, when not provided in the config
   # this only run once and so the impact on the import time is low
    :local ipv4Posix	  "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}"
    :local ipv4rangePosix "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}/[0-9]{1,2}"
    :local domainPosix	  "^.+\\.[a-z.]{2,7}"
    :local sdata $data;
   # removes any lines at the top of the file that could interfere with finding the correct posix. Setting remarksign is needed
    :while ([:pick $sdata 0 1] = $remarksign) do={ :set $sdata [:pick $sdata ([:find $sdata "\n"]+1) [:len $sdata]] }    
    :while ([:len $sdata]!=0 && $delimiter = null) do={ # The check on length of $sdata is for if no delimiter is found.   
       	:local sline [:pick $sdata 0 [:find $sdata "\n"]]; :local slen [:len $sline];
       	# set posix depending of type of data used in the list
       	:if ($sline ~ $ipv4Posix)	    do={:set $posix $ipv4Posix;	     :set $iden "List identified as a IPv4 list"}
       	:if ($sline ~ $ipv4rangePosix)	do={:set $posix $ipv4rangePosix; :set $iden "List identified as a IPv4 with ranges list"}
       	:if ($sline ~ $domainPosix)	    do={:set $posix $domainPosix;	 :set $iden "List identified as a domain list"}
       	:if ($sline ~ $posix) do={:put $iden}
      	:if ($sline ~ $posix) do={ # only explore the line if there is a match at the start of the line.
	      :do {:if ([:pick $sline 0 ($slen-$send)] ~ ($posix."\$") || $send > $slen) do={
	        :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-1))]; :set $result true} else={:set $send ($send+1)}  
             :if ($result) do={ :set  $extra [:pick $sline ($slen-$send) ($slen-($send-1))]
              :if ( $extra = " " )   do={ :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-2))] }
              :if ( $extra = "  " )  do={ :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-3))] }
              :if ( $extra = "   " ) do={ :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-4))] }
             }; # EndIf result
	      } while (!$result); # EndDoWhile
	    }; #IF sline posix
	:set $sdata [:pick $sdata ([:find $sdata "\n"]+1) [:len $sdata]]; # cut off the already searched lines
	:if ($delimiter != null) do={:set $sdata [:toarray ""]} ; # Clearing sdata array, ending the WhileDo loop
    }; #WHILE END $sdata
    :local sdata [:toarray ""]
   :if ([:len $delimiter] = 0) do={ :set $delimiter "\n"; :set $delimiterShow "New Line" } else={ :set $delimiterShow $delimiter }; # when empty use NewLine 20220529	
   #} else={:put "User defind Posix: $ownposix"; :set $posix $ownposix } ; # ENDIF ownposix = null
   :if ($delimiter != null && $displayed ) do={:set $displayed false; :put "Using config provided delimiter: \"$delimiterShow\""}
   :if ($posix = null) do={:set $posix "."}; # Use a match all posix if nothing is defined or found 
   :if (!$retryflag) do={:put "Reading Part: $x $start - $end"}   
   :if ($timeout = null) do={:local timeout 00:00:00}; # if no timeout is defined make it a static entry.    
   # Remove the first line only if you are not at the start of the list
   
:while ( [:pick $data 0 1] = $remarksign) do={ :set $data [:pick $data ([:find $data "\n"]+1) [:len $data]] }; # removes the invalid line (Spamhaus) 
   
   :if ($start > 0) do={:set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]}
     :while ([:len $data]!=0) do={
       :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
       :if ( $line ~ $posix && $line~$heirule) do={    
        :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$comment timeout=$timeout; :set $counter ($counter + 1)} on-error={}; # on error avoids any panics        
       }; # if IP address && extra filter if present
      :set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
      # Cut off the end of the chunk by removing the last lines... very dirty, but it works
      :if ([:len $data] < 256) do={:set $data [:toarray ""]}    
     }; # while

  :set $start (($start-512) + $maxsize); # shifts the subsequent start back by 512
  :set $end (($end-512) + $maxsize); # shifts the subsequent end back by 512 to keep the chunks overlapping
  }; # if retryflag
 }; #do for x
 
}; # for retry
 :if ($counter < 1) do={:set $resultline "Import was NOT successful! Check if the list $listname is still being maintained."} else={:set $resultline "Completed reading $counter items into address-list $listname." } 
 :put $resultline
 :if ($nolog = null) do={:log warning $resultline }
 :if ($counter > 0) do={:do {/ip firewall address-list remove [find where list=("backup".$listname)]} on-error={} } else={
 :do {:foreach i in=[/ip firewall address-list find list=("backup".$listname)] do={/ip firewall address-list set list=$listname $i }} on-error={}
 :put "Restoring backup list: $listname" 
 :if ($nolog = null) do={:log warning "Restoring backup list: $listname"}
 }; # if counter restore on failure and remove on success
}; # do
$update url=https://www.spamhaus.org/drop/drop.txt listname=spamhaus delimiter=("\_") timeout=1d
}

# To be used configline settings:
# url=	        https://name.of.the.list
# listname=	name of address-list

# Optional settings
# timeout=	the time the entry should be active. If omitted, static entries are created.
# comment=	puts this comment on every line in the chosen address-list (default: no comment)
# heirule=	import a line only if it matches this word (default: no heirule)
# noerase=	any value, then the current list is not erased (default: erase)
# ownPosix=	allows entering your own POSIX regex to be used (not active at this moment)
# nolog=        any value, then don't write to the log (default: writing to log)
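The auto-detection block in the script probes a sample line with three anchored patterns (plain IPv4, IPv4 with a /prefix, domain), letting the most specific match win. A rough Python equivalent of just the classification step (names are illustrative; RouterOS `~` matching is approximated here with `re.match`):

```python
import re

# The three patterns the script probes a sample line with, in the same order:
IPV4       = r"^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}"
IPV4_RANGE = r"^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}/[0-9]{1,2}"
DOMAIN     = r"^.+\.[a-z.]{2,7}"

def identify(line: str) -> str:
    """Classify one list line the way the script does: later, more
    specific matches overwrite earlier ones, so an address carrying a
    /prefix is reported as a range list rather than a plain IPv4 list."""
    kind = "unknown"
    if re.match(IPV4, line):
        kind = "IPv4 list"
    if re.match(IPV4_RANGE, line):
        kind = "IPv4 with ranges list"
    if re.match(DOMAIN, line):
        kind = "domain list"
    return kind
```

Once the type is known, the script only needs to find the delimiter that ends the matched portion of each line.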
Screen Shot 2024-03-29 at 7.47.47 PM.png
 
kevinds
Long time Member
Long time Member
Posts: 657
Joined: Wed Jan 14, 2015 8:41 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Mar 30, 2024 2:02 am

Greetings kevinds. Unfortunately I've tried that. It starts the download, then script just stops. I'm testing it on its own, the only one in the script & fails.
Have you tried running it from the CLI?
 
MTNick
Member Candidate
Member Candidate
Posts: 106
Joined: Fri Nov 24, 2023 6:43 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat Mar 30, 2024 2:06 am

Good call. No, I didn't. But I just did, lol. I believe another poster way above in this thread (viewtopic.php?p=1066787#p1051112) had the same issue as I do below, with the same script as well.

Starting import of address-list: spamhaus
Conditional deleting all entries in address-list: spamhaus
invalid value of "to", must be integer
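One plausible reading of that error (an assumption, not confirmed here): if the server answers with a chunked response, fetch can report total=0, and without the fallback to the "downloaded" value the computed $partnumber never becomes a usable integer for the :for bound. A hypothetical Python mirror of the script's chunk-count arithmetic, including its guard:

```python
def chunk_count(total_kib: int, downloaded_kib: int, maxsize: int = 64000) -> int:
    """Mirror the script's chunk-count arithmetic. The script treats the
    fetch 'total'/'downloaded' values as KiB; when 'total' is 0 (e.g. a
    chunked response), it falls back to 'downloaded'."""
    filesize = downloaded_kib if (total_kib == 0 and downloaded_kib > 0) else total_kib
    per_chunk_kib = maxsize // 1024      # 62 KiB per 64000-byte chunk
    parts = filesize // per_chunk_kib    # whole chunks
    if filesize % per_chunk_kib > 0:
        parts += 1                       # plus the final partial chunk
    return parts
```

If both values come back 0, the count is 0 and the chunk loop has nothing valid to iterate over, which is consistent with the script stopping right after the "Conditional deleting" line.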
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sun Mar 31, 2024 4:26 pm

No longer chunking the download; instead, the file is downloaded first and then read and chunked locally: viewtopic.php?p=1067023#p1067023
 
User avatar
CoMMyz
Frequent Visitor
Frequent Visitor
Posts: 64
Joined: Fri Dec 04, 2015 10:56 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Wed May 08, 2024 2:42 am

Can anyone confirm whether the script works for the IPv6 list?
https://www.spamhaus.org/drop/dropv6.txt
 
User avatar
vdias
newbie
Posts: 28
Joined: Sat Apr 14, 2012 12:09 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat May 25, 2024 2:18 pm

Completely lost on this post...

Which is the latest script version?
 
msatter
Forum Guru
Forum Guru
Posts: 2941
Joined: Tue Feb 18, 2014 12:56 am
Location: Netherlands / Nīderlande

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Sat May 25, 2024 3:01 pm

Count posts back from yours. Like...one...two and then click the link.

Good luck with that.
 
MTNick
Member Candidate
Member Candidate
Posts: 106
Joined: Fri Nov 24, 2023 6:43 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue May 28, 2024 2:40 am

Completely lost on this post...

Which is the latest script version?

msatter provided a script here: viewtopic.php?p=1067023#p1067224. Works great and has full logging; every step it takes is logged, including the number of addresses. It backs up the current list just in case and restores it if there's a failure (I've had a failure, and it did what it was supposed to). The list is downloaded locally and deleted once the addresses are added. Read a few posts up on the linked page; msatter explains the protections added. Beautiful script.
 
User avatar
jvanhambelgium
Forum Guru
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Tue May 28, 2024 8:45 am

What would it take to be able to use the same $listname across different downloads ?
I prefer working with 1 big list, and in the comments-section I provide info about the origin of that list.

$update url=("https://" . "iplists.firehol.org/files/firehol_level3.netset") listname=Dynamic-Blacklist description="Firehol-Level3" delimiter=("\n") timeout=1d
$update url=("https://" . "iplists.firehol.org/files/firehol_level2.netset") listname=Dynamic-Blacklist description="Firehol-Level2" delimiter=("\n") timeout=1d

However, with the current script, $listname is completely flushed after each run, so there is no concatenation IF you provide a generic "listname" that is the same across 2 or 10 download sources.

The current script I use does adhere to that logic, and I post it below as reference.
At present it consolidates about 25,000 entries in my list.

:global readfile do={
:local url $1
:local thefile ""
:local filesize ([/tool fetch url=$url as-value output=none]->"downloaded")
:local maxsize 64512 ; # is the maximum supported readable size of a block from a file
:local start 0
:local end ($maxsize - 1)
:local partnumber ($filesize / ($maxsize / 1024))
:local remainder ($filesize % ($maxsize / 1024))
:if ($remainder > 0) do={ :set partnumber ($partnumber + 1) }
:for x from=1 to=$partnumber step=1 do={
:set thefile ($thefile . ([/tool fetch url=$url http-header-field="Range: bytes=$start-$end" as-value output=user]->"data"))
:set start ($start + $maxsize)
:set end ($end + $maxsize)
}
:return $thefile
}
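The readfile pattern above (fetch the file as successive Range slices, concatenate, return one string) can be sketched in Python; `fetch_range` is a stand-in for `/tool fetch` with an HTTP Range header, and sizes are treated as KiB, as the script assumes:

```python
def read_in_parts(fetch_range, filesize_kib, maxsize=64512):
    """Mirror of readfile: pull the file in successive Range slices and
    concatenate them. fetch_range(start, end) stands in for /tool fetch
    with an HTTP 'Range: bytes=start-end' header."""
    parts = filesize_kib // (maxsize // 1024)  # whole 63 KiB slices
    if filesize_kib % (maxsize // 1024) > 0:
        parts += 1                             # plus the final partial slice
    buf, start, end = b"", 0, maxsize - 1
    for _ in range(parts):
        buf += fetch_range(start, end)
        start += maxsize
        end += maxsize
    return buf
```

Because the slices here do not overlap, the function reassembles the file byte-for-byte; the line splitting then happens once, on the complete buffer.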

{
/ip firewall address-list
:local update do={
:global readfile
:put "Starting import of address-list: $listname"
:put "Deleting all dynamic entries in address-list: $listname"
:if ($heirule != null) do={:put "Using as extra filtering: $heirule"}
:if ($heirule = null) do={:set $heirule "."}
:local n 0; # counter

# remove the current list completely
:do { /ip firewall address-list remove [find where comment=$description dynamic]} on-error={};
### line replaced ### :local data ([:tool fetch url=$url output=user as-value]->"data")
:local data [$readfile $url]
:put "Imported file length $[:len $data] bytes"
:while ([:len $data]!=0) do={
:local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
:if ($line~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}" && $line~$heirule) do={
:set $n ($n+1)
:do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$description timeout=$timeout} on-error={};
}; # if IP address && extra filter if present
:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
}; # while
:put "Completed importing $listname, adding/replacing $n lines."
}; # do
$update url=("https://" . "lists.blocklist.de/lists/all.txt") delimiter=("\n") listname=Dynamic-Blacklist description=BlockDE timeout=1d
$update url=("https://" . "www.dshield.org/block.txt") delimiter=("\t") listname=Dynamic-Blacklist description=DShield timeout=1d
$update url=("https://" . "view.sentinel.turris.cz/greylist-data/greylist-latest.csv") delimiter="," listname=Dynamic-Blacklist description=GreyList timeout=1d heirule=http|smtp
$update url=("https://" . "www.spamhaus.org/drop/drop.txt") delimiter=" ; " listname=Dynamic-Blacklist description=SpamHaus timeout=1d
$update url=("https://" . "sslbl.abuse.ch/blacklist/sslipblacklist.txt") delimiter=("\r") listname=Dynamic-Blacklist description=SSLBL timeout=1d
$update url=("https://" . "www.spamhaus.org/drop/drop.txt") listname=Dynamic-Blacklist description="Spamhaus DROP" delimiter=("\_") timeout=1d
$update url=("https://" . "www.spamhaus.org/drop/edrop.txt") listname=Dynamic-Blacklist description="Spamhaus EDROP" delimiter=("\_") timeout=1d
$update url=("https://" . "sslbl.abuse.ch/blacklist/sslipblacklist.txt") listname=Dynamic-Blacklist description="Abuse.ch SSLBL" delimiter=("\r") timeout=1d
$update url=("https://" . "opendbl.net/lists/etknown.list") listname=Dynamic-Blacklist description="Compromised Hosts" delimiter=("\n") timeout=1d
$update url=("https://" . "opendbl.net/lists/bruteforce.list") listname=Dynamic-Blacklist description="Bruteforce List" delimiter=("\n") timeout=1d
$update url=("https://" . "opendbl.net/lists/talos.list") listname=Dynamic-Blacklist description="Cisco Talos" delimiter=("\n") timeout=1d
$update url=("https://" . "opendbl.net/lists/blocklistde-all.list") listname=Dynamic-Blacklist description="Blocklist.DE All" delimiter=("\n") timeout=1d
$update url=("https://" . "raw.githubusercontent.com/ktsaou/blocklist-ipsets/master/firehol_level1.netset") listname=Dynamic-Blacklist description="Firehol-Level1" delimiter=("\n") timeout=1d
$update url=("https://" . "iplists.firehol.org/files/firehol_level3.netset") listname=Dynamic-Blacklist description="Firehol-Level3" delimiter=("\n") timeout=1d
$update url=("https://" . "iplists.firehol.org/files/firehol_level2.netset") listname=Dynamic-Blacklist description="Firehol-Level2" delimiter=("\n") timeout=1d
 
Krusty
Frequent Visitor
Frequent Visitor
Posts: 76
Joined: Fri May 02, 2008 11:14 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Jul 18, 2024 10:06 am

After a while and a little bit of tweaking, here is mine.

It will create the list "blacklist" with the corresponding comment, put in all the addresses found in the lists below, and they will live for 24:30:00, so it's OK to run it once per day (and after reboot).
:global readfile do={
:local url $1
:local thefile ""
:local filesize ([/tool fetch url=$url as-value output=none]->"downloaded")
:local maxsize 64512 ; # is the maximum supported readable size of a block from a file
:local start 0
:local end ($maxsize - 1)
:local partnumber ($filesize / ($maxsize / 1024))
:local remainder ($filesize % ($maxsize / 1024))
:if ($remainder > 0) do={ :set partnumber ($partnumber + 1) }
:for x from=1 to=$partnumber step=1 do={
:set thefile ($thefile . ([/tool fetch url=$url http-header-field="Range: bytes=$start-$end" as-value output=user]->"data"))
:set start ($start + $maxsize)
:set end ($end + $maxsize)
}
:return $thefile
}

ip firewall address-list
:local update do={
:global readfile
:do {

### line replaced ### :local data ([:tool fetch url=$url output=user as-value]->"data")
:local data [$readfile $url]

# remove the current list completely
remove [find list=blacklist comment=$description]

# do the magic
:while ([:len $data]!=0) do={
:local line [:pick $data 0 [:find $data "\n"]];
:if ([:pick $data 0 [:find $data "\n"]]~"^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}") do={
:do {add list=blacklist address=([:pick $data 0 [:find $data $delimiter]].$cidr) comment=$description timeout=24:30:00} on-error={}
}
:set data [:pick $data ([:find $data "\n"]+1) [:len $data]]
}
} on-error={:log warning "Address list <$description> update failed"}
}

# lists to download
$update url=https://feeds.dshield.org/block.txt description="DShield_block" delimiter=("\t") cidr=/24
$update url=https://feeds.dshield.org/top10-2.txt description="DShield_top10_PortScan" delimiter=("\t")
$update url=https://www.spamhaus.org/drop/drop.txt description="Spamhaus_DROP" delimiter=" ; "
$update url=https://sslbl.abuse.ch/blacklist/sslipblacklist.txt description="Abuse.ch_SSLBL" delimiter=("\r")
$update url=https://lists.blocklist.de/lists/bruteforcelogin.txt description="Blocklist.de_bruteforcers" delimiter=("\n")
$update url=https://lists.blocklist.de/lists/sip.txt description="Blocklist.de_SIP" delimiter=("\n")
$update url=https://lists.blocklist.de/lists/ftp.txt description="Blocklist.de_FTP" delimiter=("\n")
$update url=https://lists.blocklist.de/lists/imap.txt description="Blocklist.de_IMAP" delimiter=("\n")
$update url=https://lists.blocklist.de/lists/strongips.txt description="Blocklist.de_strongips" delimiter=("\n")
$update url=https://lists.blocklist.de/lists/bots.txt description="Blocklist.de_bots" delimiter=("\n")
$update url=https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/normshield_all_bruteforce.ipset description="Normshield_all_bruteforce" delimiter=("\n")
$update url=https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/normshield_all_attack.ipset description="Normshield_all_attack" delimiter=("\n")
$update url=https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/normshield_all_wormscan.ipset description="Normshield_all_wormscan" delimiter=("\n")
$update url=https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/normshield_high_attack.ipset description="Normshield_high_attack" delimiter=("\n")
$update url=https://raw.githubusercontent.com/firehol/blocklist-ipsets/master/normshield_high_bruteforce.ipset description="Normshield_high_bruteforce" delimiter=("\n")
$update url=https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv description="Turris_Sentinel" delimiter=","
$update url=https://opendbl.net/lists/etknown.list description="Compromised_Hosts" delimiter=("\n")
$update url=https://opendbl.net/lists/bruteforce.list description="Bruteforce_List" delimiter=("\n")
$update url=https://opendbl.net/lists/talos.list description="Cisco_Talos" delimiter=("\n")
$update url=https://opendbl.net/lists/blocklistde-all.list description="Blocklist.DE All" delimiter=("\n")
 
elico
Member Candidate
Member Candidate
Posts: 158
Joined: Mon Nov 07, 2016 3:23 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jul 19, 2024 1:33 pm

I am wondering if it's better to use a Routing daemon to blackhole the list traffic compared to using the FW address list for that.
 
User avatar
jvanhambelgium
Forum Guru
Forum Guru
Posts: 1114
Joined: Thu Jul 14, 2016 9:29 pm
Location: Belgium

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Fri Jul 19, 2024 2:44 pm

I think that is going to burn more cpu-cycles.
I used these blacklists for both INBOUND & OUTBOUND blocking.

For egress, you could easily blackhole each of these prefixes to "nowhere"
But what about ingress traffic coming FROM these IP's ? Then we are talking about some policy-based-routing in order to blackhole them ?

Apart from that, by using the FW-filter, I generate a LOG-statement and this is pushed to Splunk for some stats etc,
If I have internal hosts contacting any of the IP's on the blacklist I want to know about that!
External packets coming from any of the prefixes on the lists are not THAT interesting.
 
eXtremer
Frequent Visitor
Frequent Visitor
Posts: 95
Joined: Fri Nov 26, 2010 10:33 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Aug 05, 2024 9:03 am

Hello. I have not been able to download the firehol list for several days; I get "failure: connection timeout", while all other lists import without an issue. I get the error below, but if I open the link in a browser I can download it without a problem. Did something change with the firehol list?

$update url=https://iplists.firehol.org/files/firehol_level1.netset listname=firehol_level1 delimiter=("\n") timeout=1d


Error:
[admin@MikroTik] > system script run blacklist
Starting import of address-list: firehol_level1
Conditional deleting all entries in address-list: firehol_level1
failure: connection timeout
[admin@MikroTik] > 

Using this script:
{
/ip firewall address-list
:local update do={
 :put "Starting import of address-list: $listname"
 :if ($nolog = null) do={:log warning "Starting import of address-list: $listname"}
 
 :local displayed true
 :local maxretry 3
 :local retrywaitingtime 120s
 :local retryflag true
 :for retry from=1 to=$maxretry step=1 do={
  :if ($retryflag) do={ :set $retryflag false; :set $counter 0
  :if ($retry > 1) do={
   :put "Source file changed. Retrying after a $retrywaitingtime wait..."
   :if ($nolog = null) do={:log warning "Source file changed. Retrying after a $retrywaitingtime wait..."}
   :delay $retrywaitingtime  }
  	
  :local fetchResult [/tool fetch url=$url keep-result=no as-value]
  :local filesize ($fetchResult->"total")
  :local downsize ($fetchResult->"downloaded") 
  :if ($filesize = 0 && $downsize > 0) do={ :set $filesize $downsize}

  :local start 0
  :local maxsize 64000;	        # requested chunk size
  :local end ($maxsize - 1);	# because start is zero the maxsize has to be reduced by one
  :local partnumber	 ($filesize / ($maxsize / 1024)); # how many chunks of maxsize
  :local remainder	 ($filesize % ($maxsize / 1024)); # the last partial chunk 
  :if ($remainder > 0)    do={ :set $partnumber ($partnumber + 1) }; # total number of chunks
  :if ($heirule != null) do={:put "Using as extra filtering: $heirule"} else={:set $heirule "."}
 # remove the current list completely if "erase" is not present (default setting)
  :if ($noerase = null) do={  
   :if ($timeout = null) do={:set $timeout 00:00:00; :do {:foreach i in=[/ip firewall address-list find list=$listname] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} } else={
   :do {:foreach i in=[/ip firewall address-list find list=$listname dynamic] do={/ip firewall address-list set list=("backup".$listname) $i }} on-error={} };                
   :put ("Conditional deleting all".$dynamic." entries in address-list: $listname")
   :if ($nolog = null) do={:log warning ("Conditional deleting all".$dynamic." entries in address-list: $listname")}
  } else={:put "Entries not conditional deleted in address-list: $listname"}; # ENDIF ERASE
 :for x from=1 to=$partnumber step=1 do={
   # get filesize to be compared to the original one; if it changed, then retry
   :local comparesize ([/tool fetch url=$url keep-result=no as-value]->"total")
   :if ($comparesize = 0 && $downsize > 0) do={ :set $comparesize $downsize}
   
   # fetching the chunks from the webserver when the size of the source file has not changed
   # empty array when the source file changed. No processing is done till the next complete retry
   :if ($comparesize = $filesize) do={:set $data ([:tool fetch url=$url http-header-field="Range: bytes=$start-$end" output=user as-value]->"data")} else={:set $data [:toarray ""]; :set $retryflag true}
     #:if ($ownposix = null) do={
  # determining the used delimiter in the list, when not provided in the config
   # this only run once and so the impact on the import time is low
    :local ipv4Posix	  "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}"
    :local ipv4rangePosix "^[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}/[0-9]{1,2}"
    :local domainPosix	  "^.+\\.[a-z.]{2,7}"
    :local sdata $data;
   # removes any lines at the top of the file that could interfere with finding the correct posix. Setting remarksign is needed
    :while ([:pick $sdata 0 1] = $remarksign) do={ :set $sdata [:pick $sdata ([:find $sdata "\n"]+1) [:len $sdata]] }    
    :while ([:len $sdata]!=0 && $delimiter = null) do={ # The check on length of $sdata is for if no delimiter is found.   
       	:local sline [:pick $sdata 0 [:find $sdata "\n"]]; :local slen [:len $sline];
       	# set posix depending of type of data used in the list
       	:if ($sline ~ $ipv4Posix)	    do={:set $posix $ipv4Posix;	     :set $iden "List identified as a IPv4 list"}
       	:if ($sline ~ $ipv4rangePosix)	do={:set $posix $ipv4rangePosix; :set $iden "List identified as a IPv4 with ranges list"}
       	:if ($sline ~ $domainPosix)	    do={:set $posix $domainPosix;	 :set $iden "List identified as a domain list"}
       	:if ($sline ~ $posix) do={:put $iden}
      	:if ($sline ~ $posix) do={ # only explore the line if there is a match at the start of the line.
	      :do {:if ([:pick $sline 0 ($slen-$send)] ~ ($posix."\$") || $send > $slen) do={
	        :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-1))]; :set $result true} else={:set $send ($send+1)}  
             :if ($result) do={ :set  $extra [:pick $sline ($slen-$send) ($slen-($send-1))]
              :if ( $extra = " " )   do={ :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-2))] }
              :if ( $extra = "  " )  do={ :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-3))] }
              :if ( $extra = "   " ) do={ :set $delimiter [:pick $sline ($slen-$send) ($slen-($send-4))] }
             }; # EndIf result
	      } while (!$result); # EndDoWhile
	    }; #IF sline posix
	:set $sdata [:pick $sdata ([:find $sdata "\n"]+1) [:len $sdata]]; # cut off the already searched lines
	:if ($delimiter != null) do={:set $sdata [:toarray ""]} ; # Clearing sdata array, ending the WhileDo loop
    }; #WHILE END $sdata
    :local sdata [:toarray ""]
   :if ([:len $delimiter] = 0) do={ :set $delimiter "\n"; :set $delimiterShow "New Line" } else={ :set $delimiterShow $delimiter }; # when empty use NewLine 20220529	
   #} else={:put "User defind Posix: $ownposix"; :set $posix $ownposix } ; # ENDIF ownposix = null
   :if ($delimiter != null && $displayed ) do={:set $displayed false; :put "Using config provided delimiter: \"$delimiterShow\""}
   :if ($posix = null) do={:set $posix "."}; # Use a match all posix if nothing is defined or found 
   :if (!$retryflag) do={:put "Reading Part: $x $start - $end"}   
   :if ($timeout = null) do={:local timeout 00:00:00}; # if no timeout is defined make it a static entry.    
   # Remove the first line only if you are not at the start of the list
   
:while ( [:pick $data 0 1] = $remarksign) do={ :set $data [:pick $data ([:find $data "\n"]+1) [:len $data]] }; # removes the invalid line (Spamhaus) 
   
   :if ($start > 0) do={:set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]}
     :while ([:len $data]!=0) do={
       :local line [:pick $data 0 [:find $data "\n"]]; # create only once and checked twice as local variable
       :if ( $line ~ $posix && $line~$heirule) do={    
        :do {add list=$listname address=[:pick $data 0 [:find $data $delimiter]] comment=$comment timeout=$timeout; :set $counter ($counter + 1)} on-error={}; # on error avoids any panics        
       }; # if IP address && extra filter if present
      :set $data [:pick $data ([:find $data "\n"]+1) [:len $data]]; # removes the just added IP from the data array
      # Cut off the end of the chunk by removing the last lines... very dirty, but it works
      :if ([:len $data] < 256) do={:set $data [:toarray ""]}    
     }; # while

  :set $start (($start-512) + $maxsize); # shifts the subsequent start back by 512
  :set $end (($end-512) + $maxsize); # shifts the subsequent end back by 512 to keep the chunks overlapping
  }; # if retryflag
 }; #do for x
 
}; # for retry
 :if ($counter < 1) do={:set $resultline "Import was NOT successful! Check if the list $listname is still being maintained."} else={:set $resultline "Completed reading $counter items into address-list $listname." } 
 :put $resultline
 :if ($nolog = null) do={:log warning $resultline }
 :if ($counter > 0) do={:do {/ip firewall address-list remove [find where list=("backup".$listname)]} on-error={} } else={
 :do {:foreach i in=[/ip firewall address-list find list=("backup".$listname)] do={/ip firewall address-list set list=$listname $i }} on-error={}
 :put "Restoring backup list: $listname" 
 :if ($nolog = null) do={:log warning "Restoring backup list: $listname"}
 }; # if counter restore on failure and remove on success
}; # do
$update url=https://iplists.firehol.org/files/firehol_level1.netset listname=firehol_level1 delimiter=("\n") timeout=1d nolog=1
$update url=https://iplists.firehol.org/files/firehol_level2.netset listname=firehol_level2 delimiter=("\n") timeout=1d nolog=1
$update url=https://view.sentinel.turris.cz/greylist-data/greylist-latest.csv listname=view.sentinel.turris.cz delimiter=, timeout=1d nolog=1
$update url=https://check.torproject.org/torbulkexitlist listname=tor_exit_list delimiter=("\n") timeout=1d nolog=1
}
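
For context: the restore branch near the end of $update assumes an earlier step (not included in this excerpt) moved the previous entries to a "backup"-prefixed list before importing. A minimal sketch of such a step, assuming $listname is already set, might be:

```routeros
# Hypothetical sketch (not part of the excerpt above): before importing,
# rename the current entries so they can be restored if the import fails.
/ip firewall address-list
:foreach i in=[find where list=$listname] do={
    set $i list=("backup" . $listname)
}
```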

#$update url=https://raw.githubusercontent.com/ktsaou/blocklist-ipsets/master/firehol_webserver.netset listname=firehol_webserver delimiter=("\n") timeout=1d nolog=1
#$update url=https://check.torproject.org/torbulkexitlist listname=tor_exit_list delimiter=("\n") timeout=1d nolog=1
#$update url=https://iplists.firehol.org/files/dshield.netset listname=DShield delimiter=("\n") timeout=1d nolog=1
# Configline settings to be used:
# url=	        https://name.of.the.list
# listname=	name of the address-list

# Optional settings
# timeout=	the time the entry should stay active; if omitted, static entries are created.
# comment=	puts this comment on every entry in the chosen address-list (default: no comment)
# heirule=	only lines containing this word are imported (default: no heirule)
# noerase=	any value; the current list is then not erased first (default: erase)
# ownPosix=	allows supplying your own POSIX regex to be used (not active at this moment)
# nolog=        any value; then don't write to the log (default: write to log)
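
Putting the settings above together, a call using the optional parameters could look like the following (the URL, list name, and heirule value are made-up placeholders, not a real feed):

```routeros
# Hypothetical example invocation using the optional settings documented above
$update url=https://example.com/blocklist.txt listname=myblacklist delimiter=("\n") timeout=8h comment="example import" heirule=badnet noerase=1 nolog=1
```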
 
eXtremer
Frequent Visitor
Posts: 95
Joined: Fri Nov 26, 2010 10:33 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Mon Aug 05, 2024 9:12 am

 
mgsichkar
just joined
Posts: 1
Joined: Sat Oct 21, 2023 9:54 am

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Sep 19, 2024 5:26 pm

Hello!
The script doesn't work because Cloudflare returns
downloaded=0
total - empty
[it@vrouter-02] > /tool fetch url=https://www.cloudflare.com/ips-v4
      status: finished
  downloaded: 0KiB
    duration: 1s

[it@vrouter-02] > /file/print list 
 # NAME                                TYPE             SIZE CREATION-TIME       
 0 skins                               directory             2024-05-12 14:31:38 
 1 ips-v4                              file              230 2024-09-19 17:19:10 

[it@vrouter-02] > /file/print detail 
 0 name=skins type=directory creation-time=2024-05-12 14:31:38 

 1 name=ips-v4 type=file size=230 creation-time=2024-09-19 17:19:10 
   contents=
     173.245.48.0/20
     103.21.244.0/22
     103.22.200.0/22
     103.31.4.0/22
     141.101.64.0/18
     108.162.192.0/18
     190.93.240.0/20
     188.114.96.0/20
     197.234.240.0/22
     198.41.128.0/17
     162.158.0.0/15
     104.16.0.0/13
     104.24.0.0/14
     172.64.0.0/13
     131.0.72.0/22
[it@vrouter-02] >
Please help me solve this!
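
A possible direction (an untested sketch, not a confirmed fix): some servers serve different content depending on the client, so it may help to send an explicit User-Agent via fetch's http-header-field parameter, or to fetch to disk and read the saved file back. The header value below is an arbitrary example.

```routeros
# Sketch 1: try an explicit User-Agent (some CDNs reject the default one)
:local result [/tool fetch url=https://www.cloudflare.com/ips-v4 http-header-field="User-Agent: curl/8.0" output=user as-value]
:put [:len ($result->"data")]

# Sketch 2: fetch to disk and read the saved file back; note that reading
# file contents from a script may be limited to about 4 KiB on older RouterOS
/tool fetch url=https://www.cloudflare.com/ips-v4 dst-path=ips-v4
:delay 2s
:local data [/file get ips-v4 contents]
:put [:len $data]
```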
 
LAYERWEB
just joined
Posts: 10
Joined: Thu Nov 14, 2024 1:40 am
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Nov 14, 2024 1:46 am

Hello!
To avoid overloading the router, you can achieve this automation by simply downloading and importing a ready-made file instead of having the entire process run on RouterOS.
Last edited by chechito on Fri Nov 15, 2024 8:33 am, edited 1 time in total.
Reason: edit linked content
 
rextended
Forum Guru
Posts: 12558
Joined: Tue Feb 25, 2014 12:49 pm
Location: Italy
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Nov 14, 2024 2:09 pm

💀⚠️CRITICAL: Never trust who provides scripts containing "/import" from "/tool fetch" from external sources.
viewtopic.php?t=203733


I can only imagine how safe LAYERWEB customers must be if the people who allegedly manage their machines make such egregious security errors,
as if they didn't even know how the things they use work...

Absolutely bulls…it and a breach of security.

You should never import from third-party sites: if someone compromises the account, even against the good will of the author,
they can do whatever they want with all the i…s who import the list of IPs as RouterOS instructions instead of as DSV lists...

Or maybe it's the author himself who has a commercial interest in reselling as BOTs all the RouterBOARDs that import the IP list as instructions, once the list has reached a certain number of victims...

Nothing prevents ertugrulturan (who passes himself off as LAYERWEB) from adding /system reset etc. to one line among many, or from reconfiguring the router to become a BOT, or many other things...
 
LAYERWEB
just joined
Posts: 10
Joined: Thu Nov 14, 2024 1:40 am
Contact:

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Nov 14, 2024 6:08 pm

Hello rextended,
The system on GitHub is displayed as completely open source. Instead of importing every time, the Scheduler can handle this process after a one-time download. There is no coercion in any way; we simply offer it as a more effective solution in terms of performance. In the services we provide there is absolutely no negative situation in terms of security. Please keep your personal comments to yourself.
Everything is listed autonomously and completely open source on GitHub.
"If SSL is available, you don't have to be afraid of a man-in-the-middle attack."
Last edited by chechito on Fri Nov 15, 2024 8:34 am, edited 1 time in total.
Reason: edit linked content
 
jaclaz
Forum Guru
Posts: 1989
Joined: Tue Oct 03, 2023 4:21 pm

Re: Address lists downloader (DShield, Spamhaus DROP/EDROP, etc)

Thu Nov 14, 2024 6:39 pm

The raised issue is not related to MITM attacks. The possible attack vector is the following, whenever you "blindly" trust a third party and use a given external address/domain over which you do not have full control:
1) someone (in perfect good faith) provides a service of some kind
2) you connect to it and get from it something (which is good, useful and what not)
3) everything is fine and works nicely
4) then, one day, either:
4.a) the good guys setting up the service lose control of the site (for *whatever* reason)
or
4.b) the contents of the site/service are replaced with malicious ones without the good guys noticing it (at all or in a timely manner)

Something hosted on GitHub may be safer against both possibilities #4.a and #4.b when compared to a "normal" domain (which can be bought, sold, or expire, and is likely to have worse access security), but it is not, in principle, failsafe.

And we have not even taken into consideration the possibility that someone intentionally builds a perfectly good service/site with the intention, from the very beginning, of leveraging its popularity for *whatever* nefarious action in 3, 6, or 12 months' time.
