Thursday, July 30, 2015

IceFloor, a Mac OS X PF firewall front-end

IceFloor is a graphical front-end (FOSS) for the Mac OS X PF firewall.

Web site: http://www.hanynet.com/icefloor/

Mac OS X ships with two firewalls:

  • ALF (Application Level Firewall). Configuration is performed via the "Security & Privacy" pane of System Preferences, which by default hides almost all of the gory details of the configuration from the end-user.
  • PF (Packet Filter). A network firewall from the OpenBSD project. Configuration is performed via the command line (shell) and/or a configuration file (pf.conf).

Both can run at the same time. Note that IceFloor does not touch any of the default OS X firewall configuration files, and it runs fine on OS X 10.8 (Mountain Lion).

If your version of OS X is Mavericks, Yosemite or later, then head over to the new http://www.murusfirewall.com; the "Lite" version is free, but a bit limited. If cost is an issue, or if you are addicted to the FOSS model, then man pfctl is your friend.
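For the pfctl route, here is a minimal sketch of the kind of commands involved (the /etc/pf.custom.conf path and the rules shown as comments are illustrative assumptions, not a recommended ruleset):

# Show current PF status and the rules currently loaded
sudo pfctl -s info
sudo pfctl -s rules

# Syntax-check a custom ruleset without loading it (-n), then load it and enable PF.
# Illustrative contents of the hypothetical /etc/pf.custom.conf:
#   block in all
#   pass out all keep state
#   pass in proto tcp from any to any port 22 keep state
sudo pfctl -n -f /etc/pf.custom.conf
sudo pfctl -f /etc/pf.custom.conf
sudo pfctl -e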

If I had to attend a security conference like DEF CON or Black Hat with my Mac, I'd certainly add PF to the existing ALF firewalling (but the best option is to use a dedicated machine for this purpose, freshly imaged, running an OS like Tails, and to avoid at all costs working on job-related material, checking private email, and so on).






Burp Suite Mac OS X icon

I created this Burp Suite icon for my own use and pleasure, and I'm sharing it here.
It's a mix of the Burp Suite and the greenpois0n icons (made with The GIMP).

To change the default icon for an application, see http://osxdaily.com/2013/06/04/change-icon-mac/

Burp Suite Mac OS X icon (size: 400 x 400)

Monday, May 18, 2015

PTF: The PenTesters Framework



PTF is a Python script designed for Debian/Ubuntu-based distributions (with plans to expand to more) to create a similar and familiar distribution for penetration testing. As pentesters, we've become accustomed to the /pentest/ directories or our own toolsets that we want to keep up to date all of the time.

PTF attempts to install all of your penetration testing tools (latest and greatest), compile them, build them, and make it so that you can install/update your distribution on any machine. Everything is organized in a fashion that is cohesive to the Penetration Testing Execution Standard (PTES) and eliminates a lot of things that are hardly used. PTF simplifies installation and packaging and creates an entire pentest framework for you. Since this is a framework, you can configure and add as you see fit. We commonly see internally developed repos that you can use as well as part of this framework. It’s all up to you.

To run PTF, first check out the config/ptf.config file, which contains the base location of where to install everything. By default this will install into the /pentest directory. Once you have that configured, move on to running PTF by typing ./ptf (or python ptf).
This will put you in a "Metasploit-esque" type of shell which has a similar look and feel, for consistency. show modules, use <module>, etc. are all accepted commands. First things first: always type help or ? to see a full list of commands.

If you want to install and/or update everything, simply do the following:

./ptf
use modules/install_update_all
run

This will install all of the tools inside of PTF. If they are already installed, it will iterate through and update everything for you automatically. You can also install or update each individual module separately, simply with use followed by run. If it's already installed, it will simply update the package.
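For example, to install or update a single tool, the session looks like this (the module path below is only an illustration; run show modules inside PTF to see what is actually available):

./ptf
use modules/exploitation/metasploit
run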
You can also use show options to change information about the modules. If you want to create your own module, it's simple. First, head over to the modules/ directory; inside it are subdirectories based on the Penetration Testing Execution Standard (PTES) phases. Go into those phases and look at the different modules. As soon as you add a new one, for example testing.py, it will automatically be imported the next time you launch PTF. There are a few key components in a module that must be completed.

To download PTF, head over to GitHub or clone it:



git clone https://github.com/trustedsec/ptf

Full announcement: https://www.trustedsec.com/may-2015/new-tool-the-pentesters-framework-ptf-released/

Saturday, February 14, 2015

Opt out of global data surveillance programs like PRISM, XKeyscore and Tempora

https://prism-break.org
Loads of resources for many platforms. Worth the bookmark!

Saturday, February 15, 2014

MEMEX: a DARPA Project (DARPA-BAA-14-21)

MEMEX: a good idea, but it mostly depends on who will use it, and how...
It's always the same story: humans can be good and bad at the same time. Nuclear research findings helped us fight diseases, but also granted us massive destruction capabilities.

MEMEX: what is it all about?


PART II: FULL TEXT OF ANNOUNCEMENT 
I. FUNDING OPPORTUNITY DESCRIPTION
The Defense Advanced Research Projects Agency (DARPA) is soliciting proposals for innovative research to maintain technological superiority in the area of content indexing and web search on the Internet. Proposed research should investigate approaches that enable revolutionary advances in science, devices, or systems. Specifically excluded is research that primarily results in evolutionary improvements to the existing state of practice.(...)
Overview
Today's web search is limited by a one-size-fits-all approach offered by web-scale commercial providers. They provide a centralized search, which has limitations in the scope of what gets indexed and the richness of available details. For example, common practice misses information in the deep web and ignores shared content across pages. Today's largely manual search process does not save sessions or allow sharing, requires nearly exact input with one at a time entry, and doesn't organize or aggregate results beyond a list of links.
The Memex program envisions a new paradigm, where one can quickly and thoroughly organize a subset of the Internet relevant to one’s interests.(...)


Well, I could simply submit my application to DARPA telling them to look at what the NSA is able to do, but it's not that simple anymore since Snowden's revelations ;-)


Why should I care?

My (main) concerns in red...

Source: the DARPA announcement is available here: https://www.fbo.gov/index?s=opportunity&mode=form&id=426485bc9531aaccba1b01ea6d4316ee

There is a PDF as well. Here are some extracts.


Technical Area 1: Domain-Specific Indexing 
Crawling should also be robust to automated counter-crawling measures, crawler bans based on robot behavior, human detection, paywalls and member-only areas, forms, dynamic and non-HTML content, etc.
Information extraction may include normalization of heterogeneous data, natural language processing for translation and entity extraction and disambiguation, image analysis for object recognition, coreference resolution, extraction of multimedia (e.g., pdf, flash, video, image), relevance determination, etc.
Technical Area 2: Domain-Specific Search 
Technical Area 2 includes the creation of a configurable domain-specific interface into web content. The domain-specific interface may include: conceptually aggregated results, e.g., a person; conceptually connected content, e.g., links for shared attributes; task relevant facets, e.g., key locations, entity movement; implicit collaboration for enriched content; explicit collaboration with shared tags; recommendations based on user model and augmented index, etc.

Erm... Maltego?

Also, TA2 performers will work with TA1 performers on the design of a query language for directing crawlers and information extraction algorithms. A language to specify the domain, including both crawling as well as interface capability, may include concepts, sets of keywords, time delimitations, area delimitations, IP ranges, computational budgets, semi-automated feedback, iterative methods, data models, etc.
Technical Area 3: Applications
Human Trafficking, especially for the commercial sex trade, is a line of business with significant web presence to attract customers and is relevant to many types of military, law enforcement, and intelligence investigations. The use of forums, chats, advertisements, job postings, hidden services, etc., continue to enable a growing industry of modern slavery. An index curated for the counter trafficking domain, including labor and sex trafficking, along with configurable interfaces for search and analysis will enable a new opportunity for military, law enforcement, legal, and intelligence actions to be taken against trafficking enterprises.
Other application domains will be considered during the life of the program, possibly including indexing and interfaces for found data, missing persons, counterfeit goods, etc.
Since technology development will be guided by end-users with operational support expertise, DARPA will engage elements of the DoD and other agencies to develop use cases and operational concepts around Human Trafficking and other domains. 
"Human Trafficking". Cool, no one can argue that it's not a legit one. I personally think the use case is relevant. Another one is money laundering, because it is very often bound to human trafficking and slavery. I suppose they are many other use cases that would benefit from such project. 
But still... What if it is used by an entity or personae with other goals in mind? (the list of "abnormal" activities is way too long to be written down here, and it depends on too many factors like culture, politics, education, regulations, etc. I told you, the list is not exhaustive)


2. Foreign Participation

Non-U.S. organizations and/or individuals may participate to the extent that such participants comply with any necessary nondisclosure agreements, security regulations, export control laws, and other governing statutes applicable under the circumstances. 
Others? Circumstances? A bit vague...
D. Other Eligibility Requirements
1. Ability to Support Classified Development, Integration and Transition

While the program itself is unclassified, interactions with end-users and potential transition partners will require Technical Area 3 performers to have access to classified information. Therefore, at the time of award, all prime proposers to Technical Area 3 must have (at a minimum) Secret facility clearance and have personnel under their Commercial and Government Entity (CAGE) code with a Secret clearance that are eligible for TS/SCI. Technical Area 3 proposers must provide their CAGE code and security point(s) of contact in their proposals. 
(...)
Proposers for Technical Areas 1 and 2 are not required to have security clearances.

Yes, this is a touchy project...


Whoever manages to really index all of the Internet, including the Dark Web, will for sure be in a position to compete, with substantial chances of winning, in any arena (commercial, military, social).
There are legitimate fights, like the use case proposed for the project, or child protection for example, and many more. But I'm always scared of what we humans are able to do: turning something good into evil.
I am also anxious about a possible dichotomy of the Internet.

Whoever is able to mangle this massive amount of data even further will somehow be able to achieve some form of "prediction".
If I am able to know, I will then be able to infer. Am I allowed to do so (regulations)?



Crawling should also be robust to automated counter-crawling measures, crawler bans based on robot behavior, human detection, paywalls and member-only areas, forms, dynamic and non-HTML content, etc.
If I am running a web server and have configured it not to be indexed (robots.txt), it is because I deliberately chose to do so! If I am a member of a private or invitation-only forum, it is maybe because I am trying to keep my private and professional lives separate. There are many other genuine examples I could mention.


  • The risk is that the Dark Web will become even darker, and with the possible advent of pervasive encryption it might become even trickier for non-aware people (read: non-geeks) to access information that is not under someone else's scrutiny.
  • This raises numerous concerns, the major one being freedom of speech, along with the difficulty of accessing different sources of information, some controlled and some not. To make up my mind, I like to read different newspapers; they are not necessarily from the same side of the Thames.
  • Another one is the lack of awareness of our representatives when it comes to Information Technology. Most of them don't have a geek mindset and are therefore able neither to assess nor to regulate what is in the pipeline with regard to information gathering and the controls that can be applied to it. Indeed, the Internet today is (still almost) free, but far from properly regulated. Why? Well, this is a tricky topic, isn't it? "Laws are like sausages - it is best not to see them being made" (Otto von Bismarck).

To sum up:


Our future will be massively interconnected, each object having its own personality (properties and methods... some OOP developers will like this), an "object" being either a person or a real object. Tracking what they are really doing - the traces they leave all over the Internet over time - is to me a bit Orwellian. Nevertheless, this DARPA project is very interesting. There is a genuine need to map the entire Internet's content in order to possibly infer evil behaviors and thus fight crime, prevent attacks before they happen, and draw or graph interesting facts/events/ideas over time.

As a security practitioner I see the benefits, while as a citizen I fear the misuse(s), especially since today's regulations are far from suitable, or are still being drafted.


RasKal, 2014 February the 15th.





Sunday, December 8, 2013

digpcap.py - Network Capture files mangling


If you are storing huge amounts of network capture files (tcpdump output, or output from similar programs), then you might be interested in trying this Python script, which aims at finding "the" correct capture file(s) among all the directories that are automagically created by tcpdump when the write-to-disk option is activated.

Example of such a tcpdump command line:

tcpdump -s 0 -v -C 100 -W 10000  -w $dumpdir/`date +"%Y/%m/%d/"`dump

where:

  • -s 0 (to capture entire packet)
  • -v (verbose, prints more details. See tcpdump man page for additional v options, e.g. -vvv)
  • -C 100 (capture up to 100MB of packets per file)
  • -W 10000 (store up to 10000 rollover files before starting from 0 again, e.g. dump00001, dump00002, etc.)
  • -w (write to file)
  • $dumpdir (the root folder of the directory tree where captures are stored, e.g. /var/log/pcap). $dumpdir is important: you will have to modify the Python script to reflect your own value (I could have had this passed as a command-line argument but, for my own usage, it doesn't make sense as this folder is not likely to change often: YMMV).
  • `date +"%Y/%m/%d/"`dump (e.g. if today's date is the 15th of November 2013, then it will yield 2013/11/15/dump)
  • So -w $dumpdir/`date +"%Y/%m/%d/"`dump will store capture file as /var/log/pcap/2013/11/15/dump00001 and the -C switch will create dump00002 as soon as dump00001 reaches the 100MB's limit.

The script returns a list of files matching start and end time constraints provided as arguments.

Some examples of arguments the script is able to handle:

raskal$ ./digpcap.py 
usage: digpcap.py [-h] -f FROMDATE [-t TODATE] [-s SEARCH] [-v]
digpcap.py: error: argument -f/--fromdate is required

  • -f (from date) is the only mandatory argument required to avoid the above error message. It is the date you want to start searching for ... something.
  • Without -t (to date), it defaults to now().
  • -s is an optional argument; it is not in use at the moment (look at the script content for the roadmap).
  • -v switches to verbose mode, so its output cannot be piped to another command like tshark (in a for-loop context, see below).
  • -h ... well, usage and some examples....
Script dependencies:

- python-dateutil
- capinfos

- Update the CAP_FOLDER global variable to reflect your own setup; CAP_FOLDER is the place where the tcpdump capture files are stored.

CAP_FOLDER example (no trailing / slash at the end, please):
CAP_FOLDER = '/var/log/pcap'
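Since the script relies on capinfos to work out each capture file's time span, you can reproduce that check manually; a quick sketch, using one of the files from the -v example below:

# Print the capture start (-a) and end (-e) times of a single dump file;
# this is essentially the interval information digpcap.py works with
capinfos -a -e /var/log/pcap/2013/12/11/dump11007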
The -v argument switch (the extra verbose-only lines are the "Will dig" and "interval" lines in the example below):
raskal$ ./digpcap.py -v -f "Dec. 11, 2013 14:40:00"
Will dig None from 2013-12-11 14:40:00 to 2013-12-11 14:49:51.544586
2013/12/11/dump11007 interval: 2013-12-11 14:39:08 to 2013-12-11 14:47:22
2013/12/11/dump11008 interval: 2013-12-11 14:47:22 to 2013-12-11 14:53:51
2013/12/11/dump11007
2013/12/11/dump11008


As you can see, the -v argument produces extra output that is not suitable for further piping and processing. Use it to check that the script is not messing around.
Last example: searching for capture files stored between November the 4th at 11:30am and November the 5th at 3:46:50am (no -v switch this time).
raskal$ ./digpcap.py -f "Nov. 4, 2013 11:30" -t "Nov. 5, 2013 3:46:50" 
2013/11/04/dump09007
2013/11/04/dump09008
2013/11/04/dump09009
2013/11/04/dump09010
2013/11/05/dump00001
2013/11/05/dump00002
Note: at midnight, a cron job restarts the tcpdump process, so the dump##### numbering starts afresh.
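For reference, that midnight restart can be a crontab entry along these lines (the wrapper script name is purely hypothetical; it would simply kill the running tcpdump and relaunch the capture command line shown at the top of this post):

# m h dom mon dow  command
0 0 * * * /usr/local/bin/restart-tcpdump.sh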
Without -v you can further process each file one by one; your imagination is the limit. (Note: no need to enter the time in HH:MM:SS format, HH is enough.)

raskal$ for f in `./digpcap.py -f "Aug. 23, 2012 14" -t "Aug. 24, 2012 14"`; do echo "Look Ma, got " $f; done
Look Ma, got 2012/08/23/ipv4.pcap
Look Ma, got 2012/08/23/ipv6.pcap
Look Ma, got 2012/08/24/fragment.pcap
Look Ma, got 2012/08/24/link.pcap

The script... digpcap.py (version 0.3)
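And to close the loop on the piping idea mentioned for the -v switch, here is a sketch that feeds each returned file to tshark instead of echo (the dates and the http.request display filter are arbitrary examples):

# The script prints paths relative to CAP_FOLDER, so prepend it (here /var/log/pcap);
# -r reads a capture file, -Y applies a Wireshark display filter
for f in `./digpcap.py -f "Aug. 23, 2012 14" -t "Aug. 24, 2012 14"`; do tshark -r /var/log/pcap/"$f" -Y 'http.request'; done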

Sunday, October 27, 2013

ICANN Arabic, Chinese and Cyrillic GTLDs versus pattern matching