From mailforlen at yahoo.com Fri Aug 5 01:58:27 2011
From: mailforlen at yahoo.com (Len Lavens)
Date: Fri, 5 Aug 2011 01:58:27 -0700 (PDT)
Subject: [Netalyzr] did you know I can find all the results by IP address in Google
Message-ID: <1312534707.30872.YahooMailNeo@web39311.mail.mud.yahoo.com>

belsec.skynetblogs.be

Yes, Google has indexed all the results it has found during its visits, and your robots.txt probably doesn't exclude the results, so now all that information is public.

You may also want to ask Google (through your Google account) to delete all the results from its cache once you have installed your robots.txt, just to be sure no hacker sees all this very technical and useful information.

mailforlen
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ICSI.Berkeley.EDU/pipermail/netalyzr/attachments/20110805/8ced0205/attachment.html

From christian at icir.org Fri Aug 5 10:36:36 2011
From: christian at icir.org (Christian Kreibich)
Date: Fri, 05 Aug 2011 10:36:36 -0700
Subject: [Netalyzr] did you know I can find all the results by IP address in Google
In-Reply-To: <1312534707.30872.YahooMailNeo@web39311.mail.mud.yahoo.com>
References: <1312534707.30872.YahooMailNeo@web39311.mail.mud.yahoo.com>
Message-ID: <4E3C2A24.5030502@icir.org>

On 08/05/2011 01:58 AM, Len Lavens wrote:
> belsec.skynetblogs.be
>
> Yes, Google has indexed all the results it has found during its visits,
> and your robots.txt probably doesn't exclude the results,
> so now all that information is public.

That's right, thanks for the hint. We're actually in the process of putting a place a decent robots.txt file. Note however that Google has only found those additional the session summaries via directory traversal.

Best,
Christian

From nweaver at ICSI.Berkeley.EDU Fri Aug 5 10:58:15 2011
From: nweaver at ICSI.Berkeley.EDU (Nicholas Weaver)
Date: Fri, 5 Aug 2011 10:58:15 -0700
Subject: [Netalyzr] did you know I can find all the results by IP address in Google
In-Reply-To: <1312534707.30872.YahooMailNeo@web39311.mail.mud.yahoo.com>
References: <1312534707.30872.YahooMailNeo@web39311.mail.mud.yahoo.com>
Message-ID: <25A38B59-8B59-45F5-A865-72F96BDD8816@icsi.berkeley.edu>

More specifically, the reports Google has indexed are summary reports to which people have posted links in places Google crawls. Almost all of them are League of Legends users, whose advanced network debugging instructions specifically tell users to publicly post their Netalyzr links in the forums. Thus even with a robots.txt (which we are going to add), these will still be discoverable by searching for URLs pointing TO netalyzr.
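For illustration, a minimal robots.txt that keeps compliant crawlers away from the report pages could look like the following; the /summary/ path here is only a placeholder for wherever the session summaries actually live, not a statement about the server's real layout:

    User-agent: *
    Disallow: /summary/

Of course, this only applies to crawlers that honor robots.txt; it does not remove pages Google has already indexed, and it does nothing about the links people have posted elsewhere.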
On Aug 5, 2011, at 1:58 AM, Len Lavens wrote:

> belsec.skynetblogs.be
>
> Yes, Google has indexed all the results it has found during its visits,
> and your robots.txt probably doesn't exclude the results,
> so now all that information is public.
>
> You may also want to ask Google (through your Google account) to delete
> all the results from its cache once you have installed your robots.txt,
> just to be sure no hacker sees all this very technical and useful information.
>
> mailforlen
> _______________________________________________
> Netalyzr mailing list
> Netalyzr at mailman.ICSI.Berkeley.EDU
> http://mailman.ICSI.Berkeley.EDU/mailman/listinfo/netalyzr

From christian at icir.org Fri Aug 5 11:05:38 2011
From: christian at icir.org (Christian Kreibich)
Date: Fri, 05 Aug 2011 11:05:38 -0700
Subject: [Netalyzr] did you know I can find all the results by IP address in Google
In-Reply-To: <4E3C2A24.5030502@icir.org>
References: <1312534707.30872.YahooMailNeo@web39311.mail.mud.yahoo.com> <4E3C2A24.5030502@icir.org>
Message-ID: <4E3C30F2.8020207@icir.org>

On 08/05/2011 10:36 AM, Christian Kreibich wrote:
> That's right, thanks for the hint. We're actually in the process of
> putting a place a decent robots.txt file. Note however that Google has
> only found those additional the session summaries via directory traversal.

Yikes, it seems I completely butchered this message before sending, apologies. I meant to say that Google has only found those sessions because they were posted in public forums, as Nick has pointed out, and that there is *no* way to find additional session summaries via directory traversal.

Sorry for the confusion,
Christian

From mailforlen at yahoo.com Sat Aug 6 11:15:19 2011
From: mailforlen at yahoo.com (Len Lavens)
Date: Sat, 6 Aug 2011 11:15:19 -0700 (PDT)
Subject: [Netalyzr] did you know I can find all the results by IP address in Google
In-Reply-To: <25A38B59-8B59-45F5-A865-72F96BDD8816@icsi.berkeley.edu>
References: <1312534707.30872.YahooMailNeo@web39311.mail.mud.yahoo.com> <25A38B59-8B59-45F5-A865-72F96BDD8816@icsi.berkeley.edu>
Message-ID: <1312654519.89563.YahooMailNeo@web39305.mail.mud.yahoo.com>

This is quite surprising.

First, it seems nobody really read the posts about this software and the essential privacy and security questions, which were never addressed. It seems you were, as scientists, so absorbed in your project that you forgot its wider impact and possibilities; the technical questions pushed the security and privacy problems into the background.

First, the Google search did not show links to reports on other forums but reports on your site itself. The search term was site:yoursite, and it showed all the results on your site. That these results are also posted on other sites only makes the problem bigger, so that response is a mistake whose only purpose is to evade your responsibility.

You should, however, put a clause in your conditions (which you don't have, in a superlegalistic country like the US?) saying that this information shouldn't be published on the internet because it contains too much dangerous security information, and that you aren't responsible, etc.
Three side remarks on the point that a robots.txt will be made:

* this changes nothing about the fact that the security information in the reports can be extremely dangerous
* this changes nothing about the fact that that information should be better protected against prying eyes
* this changes nothing about the fact that the information will still be on the site for scanners and search engines that don't respect the robots.txt

Secondly, there are several things you should do:

* you should limit the tests to people who aren't behind a NAT (network); the reason is that other people are responsible for a network, and only those people should be able to use your service. You should also automatically refuse to let it be used by certain organisations and firms, like the military, official government and banks, to be sure that you aren't being misused. You should make it possible for the network admins to use your service, but only after registration and verification, with their official email address and with a number of waivers of your responsibility for them to sign.
* you should change the structure so that people sign a declaration before the test is run. I don't understand why, in the time of LulzSec, you take on all these responsibilities without a disclaimer that is sufficient for the present situation of the internet and the legal environment.
* and some other things I think about

But it all depends: do you want to do the right thing, or do you still think that there is only a limited problem?

If you want me to help you, I will. But if you think that I am just a stupid kid, then I will continue my research and campaign.

belsec.skynetblogs.be

________________________________
From: Nicholas Weaver
To: Len Lavens
Cc: Nicholas Weaver; "netalyzr at mailman.ICSI.Berkeley.EDU"
Sent: Friday, August 5, 2011 7:58 PM
Subject: Re: [Netalyzr] did you know I can find all the results by IP address in Google

More specifically, the reports Google has indexed are summary reports to which people have posted links in places Google crawls. Almost all of them are League of Legends users, whose advanced network debugging instructions specifically tell users to publicly post their Netalyzr links in the forums. Thus even with a robots.txt (which we are going to add), these will still be discoverable by searching for URLs pointing TO netalyzr.

On Aug 5, 2011, at 1:58 AM, Len Lavens wrote:
> belsec.skynetblogs.be
>
> Yes, Google has indexed all the results it has found during its visits,
> and your robots.txt probably doesn't exclude the results,
> so now all that information is public.
>
> You may also want to ask Google (through your Google account) to delete
> all the results from its cache once you have installed your robots.txt,
> just to be sure no hacker sees all this very technical and useful information.
>
> mailforlen
> _______________________________________________
> Netalyzr mailing list
> Netalyzr at mailman.ICSI.Berkeley.EDU
> http://mailman.ICSI.Berkeley.EDU/mailman/listinfo/netalyzr
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ICSI.Berkeley.EDU/pipermail/netalyzr/attachments/20110806/8c2b8837/attachment.html

From mail at sabahattin-gucukoglu.com Sat Aug 6 15:33:53 2011
From: mail at sabahattin-gucukoglu.com (Sabahattin Gucukoglu)
Date: Sun, 7 Aug 2011 01:33:53 +0300
Subject: [Netalyzr] did you know I can find all the results by IP address in Google
In-Reply-To: <1312654519.89563.YahooMailNeo@web39305.mail.mud.yahoo.com>
References: <1312534707.30872.YahooMailNeo@web39311.mail.mud.yahoo.com> <25A38B59-8B59-45F5-A865-72F96BDD8816@icsi.berkeley.edu> <1312654519.89563.YahooMailNeo@web39305.mail.mud.yahoo.com>
Message-ID: 

I'm not part of the Netalyzr team; I just want to respond to some of your points.

On 6 Aug 2011, at 21:15, Len Lavens wrote:
> First, it seems nobody really read the posts about this software and the essential privacy and security questions, which were never addressed.
> It seems you were, as scientists, so absorbed in your project that you forgot its wider impact and possibilities;
> the technical questions pushed the security and privacy problems into the background.

No, I don't think that's the case. Some of us, for instance, are of the opinion that security problems (like the ones you reference in your blog post - thanks for splattering the information around without first warning the team) are still problems regardless of how much you try to obscure them. I'm one of them. I would be an irresponsible user of the Internet if the details in my reports were sufficient to give an attacker a way into my machines or networks. And while I accept that not revealing the information is a reasonable damage-limitation tool, and that not everyone will be as "savvy" as me, beyond a reasonable consideration of the possible dangers it's not Netalyzr's problem to do anything more about it than simply not create directories of such reports.

> First, the Google search did not show links to reports on other forums but reports on your site itself.
> The search term was site:yoursite, and it showed all the results on your site.
> That these results are also posted on other sites only makes the problem bigger,
> so that response is a mistake whose only purpose is to evade your responsibility.

I think he means that the links come from other forums. In that case, we would expect Netalyzr to be the target in the Google results. For example, I recently did exactly that in a public forum, to help my ISP debug an issue with my connection.

> You should, however, put a clause in your conditions (which you don't have, in a superlegalistic country like the US?) saying that this information shouldn't be
> published on the internet because it contains too much dangerous security information, and that you aren't responsible, etc.

I imagine that these scientists feel the same way about legalese as security researchers do, i.e., not a great deal.

> Three side remarks on the point that a robots.txt will be made:
> * this changes nothing about the fact that the security information in the reports can be extremely dangerous

I question this, of course. Too many security researchers are believers in "defense in depth" without regard for design, efficiency or pretty much anything else. They want to put computers behind walled fortresses, without regard for the security of the software running on them. They want to impair protocols by making them connection-unaware, without regard for scalability.
They want to encourage NAT, because goodness me, NAT is such a wonderful security tool, despite the fact that firewalls - sometimes stateless ones - are better for the job ...

> * this changes nothing about the fact that that information should be better protected against prying eyes

How?

> * this changes nothing about the fact that the information will still be on the site for scanners and search engines that don't respect the robots.txt

But it addresses your Google search concern nicely. Would the site have been worth your time had Google turned up nothing?

> Secondly, there are several things you should do:
> * you should limit the tests to people who aren't behind a NAT (network); the reason is that other people are responsible
> for a network, and only those people should be able to use your service.

When I asked Netalyzr if they would implement NAT-PMP testing, they very politely refused, on the grounds that it might change the configuration of somebody else's network. Isn't that thoughtful of them? Wish I'd thought of that. Bear in mind that anybody on a NAT-PMP-enabled network can open any ports to the host making the requests. And I disagree that the tests Netalyzr does are in any way harmful, as above, so it doesn't matter whether you have permission or not. It's no different from the many port scanners out there. The best you can do is say something like, "Please check with your network admin before possibly setting off all the alarms." And be assured that many corporate networks use NAT, so using NAT to decide whether or not a network is personal won't work anyway. That's not to speak of all the dumb firewall configurations on corporate networks which can and should be named and shamed for being quite so dumb, like the ones which block oversize DNS responses.

> You should also automatically refuse to let it be used by certain organisations and firms, like the military, official government and banks,
> to be sure that you aren't being misused.
> You should make it possible for the network admins to use your service, but only after registration and verification, with their official email address
> and with a number of waivers of your responsibility for them to sign.

Much as above; overkill. You think attackers will request permission before conducting surveillance? Security is *their* problem, not Netalyzr's. Besides, "Debug your Internet" isn't very helpful if it's restricted to just the admins. I suppose there's a case that banks and the like should be treated with more care than other organisations, but I can't see how that could be implemented even if they wanted to. And again, it's just not warranted by the types of test, most of which reveal information rather than actually harm the network. If the information exposure is worth preventing, it is up to the organisations themselves either to block Netalyzr (or Java) or to make sure it isn't a problem.

> * you should change the structure so that people sign a declaration before the test is run.
> I don't understand why, in the time of LulzSec, you take on all these responsibilities without a disclaimer that is sufficient for the present situation of the internet and the legal environment.

Again, probably because it just isn't warranted. Really. What LulzSec did, mostly, was demonstrate just how bad people's security is, and that it was bad because known, obvious practices were not followed. Should that not have been covered in the news because the organisations were high-value targets? Of course not!
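To make the NAT-PMP point above concrete: a mapping request is a single 12-byte UDP packet that any host behind the gateway can send. The sketch below is only an illustration of the protocol (the 192.168.1.1 gateway address and port 3389 are made-up examples), not anything Netalyzr does:

    import socket
    import struct

    # NAT-PMP (RFC 6886) mapping request: version 0, opcode 2 (map TCP),
    # two reserved bytes, internal port, suggested external port, and a
    # requested lifetime in seconds.
    GATEWAY = "192.168.1.1"   # example only; normally the default gateway
    request = struct.pack("!BBHHHI", 0, 2, 0, 3389, 3389, 7200)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(2.0)
    sock.sendto(request, (GATEWAY, 5351))   # NAT-PMP listens on UDP 5351
    try:
        response, _ = sock.recvfrom(16)     # result code sits in bytes 2-3
        print("gateway replied:", response.hex())
    except socket.timeout:
        print("no NAT-PMP response (gateway probably doesn't support it)")

There is no authentication step anywhere in the exchange: the gateway either creates the mapping and reports the external port back to the sender, or returns an error code. That is exactly the concern above.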
> * and some other things I think about
> But it all depends: do you want to do the right thing, or do you still think that there is only a limited problem?

Netalyzr should have a robots.txt file and should ensure that all forms of directory indexing and traversal are off. And that's all. IMO.

> If you want me to help you, I will. But if you think that I am just a stupid kid,
> then I will continue my research and campaign.
> belsec.skynetblogs.be

You already broke the honour system of security disclosure. Why not just report it to full-disclosure right now?

Cheers,
Sabahattin

From oldcodger1 at att.net Thu Aug 11 16:30:11 2011
From: oldcodger1 at att.net (John Estes)
Date: Thu, 11 Aug 2011 16:30:11 -0700
Subject: [Netalyzr] ID 32131236-4910-d015db9b-648b-4c38-aa18
Message-ID: <4E446603.3090401@att.net>

I am concerned about the "Major Abnormalities" and other red- and orange-coded results reported on this Netalyzr test. Are these problems with my home network (mixed Win 7, Win XP, and Win 98 computers; the test was run on a Win 7 Professional 64-bit machine), or is AT&T monkeying with my DSL connection? Please advise me what steps I might take to resolve these issues. I am using OpenDNS on this computer.

Thank you.

John Estes

From grokit at ajinfosearch.com Thu Aug 11 16:57:54 2011
From: grokit at ajinfosearch.com (Alan Dacey Sr.)
Date: Thu, 11 Aug 2011 19:57:54 -0400
Subject: [Netalyzr] Incorrect warning
Message-ID: <201108111957.54869.grokit@ajinfosearch.com>

Hi all,

I just checked out Netalyzr (me likes!) and have a bug to report. I found no information on where to do that, so I subscribed and am posting it to the list. If this is the wrong place, please let me know the correct way to report bugs.

Overview: I got a warning that should never have shown up as such. I have an extensive /etc/hosts file that blocks many evil and/or annoying sites by redirecting them to localhost. All three of the flagged sites are in that hosts file.

Behavior: Unwarranted warnings (in red) showed up in the results.

Expected action: DNS lookups of popular domains should be treated as information instead of warnings when the IP address is 127.0.0.1 (localhost).

Taken from the test results:

  DNS lookups of popular domains (?): Warning
  3 popular names have a significant anomaly. The ownership suggested by the reverse name lookup does not match our understanding of the original name. This could be caused by an error somewhere in the domain information, deliberate blocking or redirection of a site using DNS, or it could be that your ISP's DNS Server is acting as a DNS "Man-in-the-Middle".

  Name                           IP Address   Reverse Name/SOA
  ad.doubleclick.net             127.0.0.1    localhost
  www.google-analytics.com       127.0.0.1    localhost
  partner.googleadservices.com   127.0.0.1    localhost

- Alan
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://mailman.ICSI.Berkeley.EDU/pipermail/netalyzr/attachments/20110811/f8098730/attachment.html
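For what it's worth, the kind of check Alan is asking for is small. The sketch below is not Netalyzr's actual code, and the mismatch_severity name is made up; it only illustrates how a loopback answer for a popular name could be reported as informational rather than as a warning:

    import ipaddress
    import socket

    def mismatch_severity(name):
        """Given a popular name whose reverse lookup did not match the
        expected owner, decide how loudly to report it.  A loopback answer
        almost certainly means a local hosts-file block rather than a DNS
        man-in-the-middle, so it only deserves an informational note."""
        addr = ipaddress.ip_address(socket.gethostbyname(name))
        return "info (blocked locally)" if addr.is_loopback else "warning"

    for name in ("ad.doubleclick.net", "www.google-analytics.com",
                 "partner.googleadservices.com"):
        print(name, "->", mismatch_severity(name))

On a machine with hosts-file entries like Alan's, all three names resolve to 127.0.0.1 and would be reported as informational; on an unmodified machine they resolve normally and the existing warning logic would still apply.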