Sunday, December 8, 2013

digpcap.py - Network Capture files mangling


If you store large amounts of network capture files (tcpdump output, or output from similar programs), you might be interested in trying this Python script. It aims at finding "the" correct capture file(s) among all the directories that tcpdump automagically creates when its write-to-disk option is activated.

Example of such tcpdump command line:

tcpdump -s 0 -v -C 100 -W 10000  -w $dumpdir/`date +"%Y/%m/%d/"`dump

where:

  • -s 0 (to capture entire packet)
  • -v (verbose, prints more details. See tcpdump man page for additional v options, e.g. -vvv)
  • -C 100 (capture up to 100MB of packets per file)
  • -W 10000 (store up to 10000 rollover files before starting from 0 again, e.g. dump00001, dump00002, etc.)
  • -w (write to file)
  • $dumpdir (the root folder of the directory tree where captures are stored, e.g. /var/log/pcap). $dumpdir is important: you will have to modify the Python script to reflect your own value (I could have this passed by command-line argument but, for my own usage, it doesn't make sense as this folder is not likely to change often: YMMV).
  • `date +"%Y/%m/%d/"`dump (e.g. if today's date is the 15th of November 2013, then it will yield 2013/11/15/dump)
  • So -w $dumpdir/`date +"%Y/%m/%d/"`dump will store capture files as /var/log/pcap/2013/11/15/dump00001, and the -C switch will create dump00002 as soon as dump00001 reaches the 100 MB limit.
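For reference, the date-based path that tcpdump ends up writing to can be reproduced in a few lines of Python, which is essentially what a script like digpcap.py has to do to map a timestamp to a capture directory. This is only a sketch: the function name is mine, and CAP_FOLDER follows the example value used further down in this post.

```python
from datetime import datetime

CAP_FOLDER = '/var/log/pcap'  # no trailing slash

def capture_dir(when):
    """Return the directory tcpdump writes to for a given timestamp,
    mirroring the shell expression `date +"%Y/%m/%d/"`."""
    return '%s/%s' % (CAP_FOLDER, when.strftime('%Y/%m/%d'))

print(capture_dir(datetime(2013, 11, 15)))  # /var/log/pcap/2013/11/15
```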

The script returns a list of files matching start and end time constraints provided as arguments.

Some examples of arguments the script is able to handle:

raskal$ ./digpcap.py 
usage: digpcap.py [-h] -f FROMDATE [-t TODATE] [-s SEARCH] [-v]
digpcap.py: error: argument -f/--fromdate is required

  • -f (from date) is the only mandatory argument and avoids the error message above. It is the date from which you want to start searching for ... something.
  • Without -t (to date), the end of the window defaults to now().
  • -s is an optional argument; it is not in use at the moment (look at the script content for the roadmap).
  • -v switches to verbose mode, and its extra output therefore cannot be piped to another command such as tshark (in a for-loop context, see below).
  • -h ... well, usage and some examples....
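The usage line above can be reproduced with a standard argparse setup; this is only a sketch of what the script's argument handling presumably looks like, not its actual code (the help strings are mine).

```python
import argparse

def build_parser():
    # Mirrors the usage line: digpcap.py [-h] -f FROMDATE [-t TODATE] [-s SEARCH] [-v]
    p = argparse.ArgumentParser(prog='digpcap.py')
    p.add_argument('-f', '--fromdate', required=True,
                   help='start of the search window, e.g. "Nov. 4, 2013 11:30"')
    p.add_argument('-t', '--todate', default=None,
                   help='end of the search window; defaults to now()')
    p.add_argument('-s', '--search', default=None,
                   help='not in use yet (see the roadmap in the script)')
    p.add_argument('-v', '--verbose', action='store_true',
                   help='print per-file intervals; not pipe-friendly')
    return p

args = build_parser().parse_args(['-f', 'Dec. 11, 2013 14:40:00', '-v'])
```

Omitting `-f` makes argparse emit exactly the kind of "argument -f/--fromdate is required" error shown above.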
Script dependencies:

- python-dateutil
- capinfos

- Update the CAP_FOLDER global variable to reflect your own setup
- CAP_FOLDER is the place where tcpdump capture files are stored.
CAP_FOLDER example (no trailing slash at the end, please):
CAP_FOLDER = '/var/log/pcap'
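The capinfos dependency hints at how per-file intervals are obtained: ask capinfos for each file's first and last packet times and parse them. Below is a hypothetical sketch (function names mine); the exact output labels vary across Wireshark versions, so the regex is deliberately loose.

```python
import re
import subprocess
from datetime import datetime

def parse_interval(text):
    # Grab the "... time: YYYY-mm-dd HH:MM:SS" lines, ignoring fractional seconds.
    stamps = re.findall(r'time:\s+(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})', text)
    return [datetime.strptime(s, '%Y-%m-%d %H:%M:%S') for s in stamps]

def file_interval(pcap_path):
    """Ask capinfos for the earliest (-a) and latest (-e) packet times."""
    out = subprocess.check_output(['capinfos', '-a', '-e', pcap_path],
                                  universal_newlines=True)
    return parse_interval(out)

# Parsing a sample of (assumed) capinfos output:
sample = ("First packet time:   2013-12-11 14:39:08\n"
          "Last packet time:    2013-12-11 14:47:22\n")
print(parse_interval(sample))
```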
The -v argument switch (-v output in blue)
raskal$ ./digpcap.py -v -f "Dec. 11, 2013 14:40:00"
Will dig None from 2013-12-11 14:40:00 to 2013-12-11 14:49:51.544586
2013/12/11/dump11007 interval: 2013-12-11 14:39:08 to 2013-12-11 14:47:22
2013/12/11/dump11008 interval: 2013-12-11 14:47:22 to 2013-12-11 14:53:51
2013/12/11/dump11007
2013/12/11/dump11008


As you can see, the -v argument produces extra output that is not suitable for further piping and processing. Use it to check that the script is not messing around.
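The file selection visible in the -v output above boils down to an interval-overlap test: a file is kept when its first-to-last packet interval intersects the requested from/to window. A minimal sketch of that test (function name mine):

```python
from datetime import datetime

def overlaps(file_start, file_end, want_from, want_to):
    """True when a capture file's packet interval intersects the
    requested search window (boundaries inclusive)."""
    return file_start <= want_to and file_end >= want_from

# dump11007 spans 14:39:08 to 14:47:22 and the search starts at 14:40:00,
# so it is kept, matching the -v output above.
print(overlaps(datetime(2013, 12, 11, 14, 39, 8),
               datetime(2013, 12, 11, 14, 47, 22),
               datetime(2013, 12, 11, 14, 40, 0),
               datetime(2013, 12, 11, 14, 49, 51)))  # True
```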
Last example: searching for capture files stored between November the 4th at 11:30am and November the 5th at 3:46:50am (no -v switch this time).
raskal$ ./digpcap.py -f "Nov. 4, 2013 11:30" -t "Nov. 5, 2013 3:46:50" 
2013/11/04/dump09007
2013/11/04/dump09008
2013/11/04/dump09009
2013/11/04/dump09010
2013/11/05/dump00001
2013/11/05/dump00002
Note: at midnight, a cron job restarts the tcpdump process, so the dump##### numbering restarts from scratch.
Without -v you can further process each file one by one; your imagination is the limit. (Note: no need to enter the time in HH:MM:SS format, HH is enough.)

raskal$ for f in `./digpcap.py -f "Aug. 23, 2012 14" -t "Aug. 24, 2012 14"`; do echo "Look Ma, got " $f; done
Look Ma, got 2012/08/23/ipv4.pcap
Look Ma, got 2012/08/23/ipv6.pcap
Look Ma, got 2012/08/24/fragment.pcap
Look Ma, got 2012/08/24/link.pcap

The script... digpcap.py (version 0.3)

Sunday, October 27, 2013

ICANN Arabic, Chinese and Cyrillic GTLDs versus pattern matching

Saturday, August 3, 2013

VirtualBox: Windows size in 4:3 ratio only

Ever been upset that your VirtualBox guest Linux OS cannot display anything other than a 4:3 ratio?
This can happen after a dist-upgrade...

Start the guest and launch a root terminal.
aptitude update
aptitude dist-upgrade (this way, you are up to date) 
/etc/init.d/vboxadd setup

That should do it; if not, make sure you have the correct kernel headers version installed.
Hint: after aptitude dist-upgrade... please reboot first and only then launch vboxadd setup.


Saturday, May 4, 2013

Getting rid of LogMeIn on Mac OS X

So you tested LogMeIn, or used it just a couple of times, and now you want to remove it from your Mac OS X system, right?

Problem: after trashing the application, as you are used to doing to uninstall an app, it's still there. There is even a menu bar icon that seems impossible to remove.

Solution: don't delete all the LogMeIn folders you can find on your Mac; instead, use the provided, somewhat hidden, uninstaller.sh script.

Open Finder and navigate to /Library/Application Support/LogMeIn
In this folder there is a file named uninstaller.sh. Drag it to a terminal and prefix the command line with sudo, like this:

raskal:$ sudo /Library/Application\ Support/LogMeIn/uninstaller.sh
Enter your password...

Fixed, it's all gone... Almost!

There are some remaining files: browser plugins...
Navigate to /Library/Internet Plug-Ins. Depending on the browsers installed on your system, you may find one or several LogMeIn plugins; e.g.,

  • LogMeIn.plugin
  • LogMeInSafari32.plugin
  • LogMeInSafari64.plugin

The best approach is to launch every browser and deactivate the plugin first; this works for Firefox but not for Safari. For Safari, you must quit it (Cmd-Q) and then remove all the LogMeIn*.plugin files. So: first deactivate the plugin in every browser except Safari, then delete the plugin files safely, and that will finally clean Safari of LogMeIn :-)


Wednesday, April 17, 2013

FileRock: open-source storage cloud provider

FileRock is an open-source cloud storage solution.

At time of writing, it's in beta and membership is subject to invitation.
Update as of September 2013: it's no longer open-source. They are switching to another model (not yet disclosed at the time of writing). What a pity! It is not the only open-source project I follow (and test, providing feedback to the community) that falls back to a commercial model... Well, it can happen, and this is part of the game (of life).

  • It's open-source (code is available for audit)
  • 3 GB free storage
  • Automated backup
  • Sync with multiple devices (Win, Linux, MAC)

The good point is that the password used to encrypt your data at rest is never sent to FileRock's servers (but you had better not forget it, as there is no way to recover your data in that case).
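FileRock's exact scheme is not documented here, but the general idea of keeping the password local can be illustrated with a standard key-derivation function: the encryption key is derived on the client and only ciphertext leaves the machine. A generic sketch, not FileRock's actual implementation:

```python
import hashlib
import os

def derive_key(password, salt):
    """Derive a 256-bit encryption key locally from the password.
    The password itself never needs to leave the machine; if it is
    lost, so is the data, since the key cannot be recovered."""
    return hashlib.pbkdf2_hmac('sha256', password.encode(), salt, 100000)

salt = os.urandom(16)  # stored alongside the ciphertext; not secret
key = derive_key('correct horse battery staple', salt)
print(len(key))  # 32 bytes
```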

Data are copied to FileRock. "Copied" means that you keep the original data on your own hard drive, and thus you can work "offline".

FileRock servers are located in Milano - Italy (http://www.seeweb.it/) and maybe also in Frosinone (South of Roma - Italy). This provider is IPv6 enabled.

Concerned about the security? See these posts: 
You will find some other links within this post that are worth the read in order to fully understand the whole picture, and decide if this solution is secure enough for you.

I did register for an invitation! Now queuing for a mail to land in my Inbox :-)


Thursday, January 17, 2013


Aerohive, Euclid partner on using Wi-Fi to decode shoppers' behavior 


Is it well designed and implemented? At first sight it is, but is it tamper-proof against malicious activity? I'm not so sure...

Copy/paste from the AeroHive Solution Brief

Every Wi-Fi radio sends out standard probe signals searching for a Wi-Fi access point (AP) to attach to. The Aerohive AP detects that probe, “anonymizes” the unique MAC address by using a cryptographic hash function (or “hash”) then encrypts the data for transport to the Euclid cloud platform for processing. From that point, Euclid advanced heuristics use several different factors – including signal strength, ping frequency, and proximity to other access points (if any) – to determine the phone’s approximate location including if it is inside or outside the retail store, and then employs proprietary algorithms to create the analytics information used by business operations.
The Aerohive Cloud Services Platform connects to the Euclid cloud through a secure JavaScript Object Notation (JSON) connection to retrieve and present the resultant analytic information in a simple screen in our HiveManager Online cloud application.
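The "anonymizes the unique MAC address by using a cryptographic hash function" step quoted above can be illustrated with a salted hash. Neither the hash function nor the salt they use is disclosed, so this is only a sketch of the general idea, and it also shows why a salt matters: the MAC address space is small enough that an unsalted hash could be reversed by brute force.

```python
import hashlib

def anonymize_mac(mac, salt=b'per-deployment-secret'):
    """Replace a MAC address with a salted SHA-256 digest. Without a
    secret salt, hashing alone is weak anonymization: an attacker can
    hash every plausible MAC and match the digests."""
    canonical = mac.lower().replace('-', ':').encode()
    return hashlib.sha256(salt + canonical).hexdigest()

print(anonymize_mac('00:11:22:33:44:55'))
```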



As a security practitioner, I always wonder whether an implementation has weaknesses that would allow malicious activity to take place. In this case my concern is that, since this solution uses MAC addresses rather than RFID, what about someone sitting in front of my store and constantly forging MAC addresses?

Examples of tools to change MAC:

  • on GNU/Linux: macchanger, or even ifconfig...
  • on MS Windows: etherchange (run from the command prompt, thus easy to script), 
  • on MAC OS X: sudo ifconfig en0 Wi-Fi <New_MAC> (Lion) or sudo ifconfig en0 ether <New_MAC> if the former is not working.
  • myself I would opt for scapy (src_mac and the like) and Python, but any ifconfig trick can be scripted.
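Any of the tools above ultimately just writes a new 48-bit address to the interface, and generating plausible addresses is trivial, which is the crux of the forging concern. A stdlib-only sketch (no scapy needed):

```python
import random

def random_mac():
    """Generate a random unicast, locally administered MAC address:
    in the first octet, the multicast bit (LSB) is cleared and the
    locally-administered bit is set."""
    first = (random.randrange(256) & 0b11111100) | 0b00000010
    rest = [random.randrange(256) for _ in range(5)]
    return ':'.join('%02x' % o for o in [first] + rest)

print(random_mac())
```

Feeding such addresses to `ifconfig`, `macchanger`, or a scapy `Ether(src=...)` layer in a loop is all the "DOS" scenario below would take.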
The other concern is the business model. Statistics are sent to Amazon AWS cloud services in a secure manner, fine with me... But in the case of a DOS (MAC forging and flooding), the collected data becomes irrelevant, and the amount of useless data will increase the cost of the solution (AWS service fees are rather complicated to understand, I must admit).
Are there any protective (counter)measures available against such issues?
Threshold on the AP, or when data are mangled on the Amazon's side? 
What about sending nonexistent MAC OUIs? Is the "input" sanitized? (Maybe; see the figure above, which states that "Data is processed, cleaned and stored securely".)
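A first-pass sanity check on the collector side could at least flag addresses that are locally administered, multicast, or whose OUI is not a registered vendor prefix. This sketch uses a tiny hard-coded OUI set purely for illustration; a real check would consult the IEEE registry.

```python
KNOWN_OUIS = {'00:11:22', 'f0:de:f1'}  # toy subset; real code would load the IEEE list

def looks_forged(mac):
    """Flag MACs that are locally administered, multicast, or whose
    OUI is absent from the (here, toy) vendor registry."""
    octets = mac.lower().split(':')
    first = int(octets[0], 16)
    if first & 0b00000010:   # locally-administered bit set
        return True
    if first & 0b00000001:   # multicast bit set
        return True
    return ':'.join(octets[:3]) not in KNOWN_OUIS

print(looks_forged('02:00:00:00:00:01'))  # True: locally administered
print(looks_forged('00:11:22:33:44:55'))  # False: "known" OUI
```

Of course, an attacker who forges addresses with valid vendor OUIs sails straight through such a filter, which is why thresholds and rate limiting would still be needed.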

Well, there are surely more concerns, but these are just the ones on top of my head at the time of writing...
Oh yes! A "funny" one... Imagine someone distributing a DDOS program that intentionally sends the exact same MAC address for a given period of time, then generates a new one (or fetches it from a C&C server) and does it all over again. I guess the statistics would become just a nice piece of (well, you get the idea, I presume).

I hope that I am all wrong with my assumptions. I really like the AeroHive technology... Actually, this is maybe why I am affected and therefore writing this post ;-)