[Netalyzr] did you know I can find all the results by IP address in Google

Sabahattin Gucukoglu mail at sabahattin-gucukoglu.com
Sat Aug 6 15:33:53 PDT 2011

I'm not part of the Netalyzr team; I just want to respond to some of your points.

On 6 Aug 2011, at 21:15, Len Lavens wrote:
> first, it seems nobody really read the posts about this software, and the essential privacy and security questions weren't treated
> it seems you were - as scientists - so absorbed in your project that you forgot its wider impact and possibilities
> the technical questions pushed the security and privacy problems into the background

No, I don't think that's the case.  Some of us, for instance, are of the opinion that security problems (like the ones you reference in your blog post - thanks for splattering the information around without first warning the team) are still problems regardless of how much you try to obscure them.  I'm one of them.  I would be an irresponsible user of the Internet if the details in my reports were sufficient to give an attacker a way into my machines or networks.  And while I accept that withholding the information is a reasonable damage-limitation tool, and that not everyone will be as "savvy" as me, it isn't Netalyzr's job, beyond a reasonable consideration of the possible dangers, to do anything more than simply not create directories of such reports.

> first, the Google search term didn't show links to reports on other forums but on your site itself
> the Google search term was site:yoursite and showed all the results on your site
> that these results are also posted on other sites makes the problem even greater
> so that response is a mistake, and its only purpose is to deny your responsibility

I think he means that the links come from other forums: people post their own reports publicly, so Netalyzr URLs will naturally turn up in Google.  For example, I recently did exactly that in a public forum, to help my ISP debug an issue with my connection.

> you should, however, put an article in your conditions - which you don't have (in a superlegalistic country like the US?) - saying that this information shouldn't be
> published on the internet because it contains too much dangerous security information, and that you aren't responsible, etc.

I imagine that these scientists feel the same way about legalese as security researchers, i.e., not a great deal.

> three side remarks on the plan that a robots.txt will be made
> * this changes nothing about the fact that the security information in the reports can be extremely dangerous

I question this, of course.  Too many security researchers are believers in "defense in depth", without regard for design, efficiency or pretty much anything else.  They want to put computers behind walled fortresses, without regard for the security of the software running on them.  They want to impair protocols by making them connection-unaware, without regard for scalability.  They want to encourage NAT, because goodness me, NAT is such a wonderful security tool, despite the fact that firewalls - sometimes stateless ones - are better for the job ...
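The point that a firewall does everything NAT is usually credited with can be made concrete.  Here is a minimal sketch - mine, not anything Netalyzr recommends - of a stateful Linux ruleset, assuming a gateway where eth0 faces the Internet and eth1 the LAN (the interface names are assumptions):

```
# Allow replies to connections the LAN initiated
iptables -A FORWARD -i eth0 -o eth1 -m state --state ESTABLISHED,RELATED -j ACCEPT
# Allow anything outbound from the LAN
iptables -A FORWARD -i eth1 -o eth0 -j ACCEPT
# Drop unsolicited inbound traffic - which is the whole of the
# "security" NAT provides, achieved here with no address translation
iptables -A FORWARD -i eth0 -o eth1 -j DROP
```

The hosts keep their real addresses and end-to-end connectivity can be restored per-port, which NAT cannot do cleanly.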

> * this changes nothing about the fact that that information should be better protected against prying eyes


> * this changes nothing about the fact that the information will still be on the site for scanners and search engines that don't respect the robots.txt

But it addresses your Google search concern nicely.  Would the site have been worth your time had Google turned up nothing?

> secondly there are several things you should do
> * you should limit the tests to people who aren't behind a NAT (network)
> the reason is that other people are responsible for a network, and only those people should be able to use your service

When I asked Netalyzr if they would implement NAT-PMP testing, they very politely refused, on the grounds that it might change the configuration of somebody else's network.  Isn't it thoughtful of them?  Wish I'd thought of that.  Bear in mind that anybody on a NAT-PMP-enabled network can open any ports to the host making requests.  And I disagree that the tests done are in any way harmful, as above, so it doesn't matter if you have permission or not.  It's no different to the many port scanners out there.  The best you can do is say something like, "Please check with your network admin before possibly setting off all the alarms."  And be assured that many corporate networks use NAT, so using NAT to decide whether or not the network is personal won't work anyway.  That's not to speak of all the dumb firewall configurations which can and should be named and shamed for being quite so dumb, on corporate networks, like the ones which block oversize DNS responses.
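To see why the team's caution over NAT-PMP is warranted, here is a sketch (mine, not Netalyzr's) of just how small a NAT-PMP port-mapping request is, per RFC 6886.  The function name and port numbers are illustrative; a real client would send this over UDP to the gateway's port 5351 and parse the response:

```python
import struct

NATPMP_PORT = 5351   # NAT-PMP gateways listen here (RFC 6886)
OP_MAP_TCP = 2       # opcode 1 maps UDP, 2 maps TCP

def build_mapping_request(internal_port, external_port, lifetime=3600):
    """Build a NAT-PMP TCP port-mapping request (RFC 6886, section 3.3).

    Layout: version (0), opcode, 2 reserved bytes, internal port,
    suggested external port, requested lifetime in seconds.
    """
    return struct.pack("!BBHHHI", 0, OP_MAP_TCP, 0,
                       internal_port, external_port, lifetime)

# Any host on the LAN can send these 12 bytes to its gateway and have
# external_port forwarded straight back to itself - no credentials asked.
packet = build_mapping_request(8080, 8080)
```

That lack of any authentication is precisely why a test suite that merely *observes* the network is the conservative choice.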

> you should also automatically refuse to let it be used by certain organisations and firms, like the military, official government and banks,
> to be sure that you aren't being used
> you should make it possible for the network admins to use your service, but only after registration and control, and with their official email address
> and with a number of 'non-responsibilities' for you to sign for them

Much as above; overkill.  You think attackers will request permission before conducting surveillance?  Security is *their* problem, not Netalyzr's.  Besides, "Debug your Internet" isn't very helpful if it's restricted to admins.  I suppose there's a case that banks and the like should be treated with more care than other organisations, but I can't see how that could be implemented even if the team wanted to.  And again, it just isn't warranted by the kinds of test run, most of which reveal information rather than actually harm the network.  If the information exposure is worth preventing, it's up to the organisations themselves to block Netalyzr (or Java); otherwise it isn't a problem.

> * you should change the structure so that people sign a declaration before the test is run
> I don't understand why, in the time of LulzSec, you take on all these responsibilities without a disclaimer that is sufficient for the present situation of the internet and the legal environment

Again, probably because it just isn't warranted.  Really.  What LulzSec mostly did was demonstrate just how bad people's security is, precisely because known, obvious practices were not followed.  Should the news coverage not have happened because the organisations were high-value targets?  Of course not!

> * and some other things I'm thinking about
> but it all depends
> do you want to do the right thing, or do you still think there is only a limited problem?

Netalyzr should have a robots.txt file and should ensure that all forms of directory indexing and traversal are off.  And that's all.  IMO.
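Concretely, those two measures amount to a couple of lines each.  A sketch, assuming Apache and a hypothetical /reports/ path for the session pages (both the path and the filesystem location are assumptions, not Netalyzr's actual layout):

```
# robots.txt at the site root: keep well-behaved crawlers out of reports
User-agent: *
Disallow: /reports/

# Apache configuration: never generate directory listings
<Directory "/var/www/reports">
    Options -Indexes
</Directory>
```

robots.txt only deters crawlers that choose to honour it, which is exactly the third side remark above; the -Indexes setting is what actually prevents the server from enumerating reports for anyone.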

> If you want me to help you
> I will
> but if you think that I am just a stupid kid
> then I will continue my research and campaign
> belsec.skynetblogs.be

You already broke the honour system of security disclosure.  Why not just report it to full-disclosure right now?
