List of free computer forensic tools
(I haven't verified the licensing, though, so better check it yourself)
https://forensiccontrol.com/resources/free-software/
PART II: FULL TEXT OF ANNOUNCEMENT
I. FUNDING OPPORTUNITY DESCRIPTION
The Defense Advanced Research Projects Agency (DARPA) is soliciting proposals for innovative research to maintain technological superiority in the area of content indexing and web search on the Internet. Proposed research should investigate approaches that enable revolutionary advances in science, devices, or systems. Specifically excluded is research that primarily results in evolutionary improvements to the existing state of practice.(...)
Overview
Today's web search is limited by a one-size-fits-all approach offered by web-scale commercial providers. They provide a centralized search, which has limitations in the scope of what gets indexed and the richness of available details. For example, common practice misses information in the deep web and ignores shared content across pages. Today's largely manual search process does not save sessions or allow sharing, requires nearly exact input with one-at-a-time entry, and doesn't organize or aggregate results beyond a list of links.
The Memex program envisions a new paradigm, where one can quickly and thoroughly organize a subset of the Internet relevant to one’s interests.(...)
Technical Area 1: Domain-Specific Indexing
Erm... Maltego?
Crawling should also be robust to automated counter-crawling measures, crawler bans based on robot behavior, human detection, paywalls and member-only areas, forms, dynamic and non-HTML content, etc.
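Before worrying about counter-crawling measures, it helps to see the conventional baseline a polite crawler starts from: honoring robots.txt. A minimal sketch using Python's standard library (the user agent and URLs are invented for illustration; the BAA clearly wants crawlers that go well beyond this):

```python
import urllib.robotparser

def is_allowed(robots_txt: str, user_agent: str, url: str) -> bool:
    """Return True if the given robots.txt permits user_agent to fetch url."""
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, url)

# Illustrative robots.txt banning everyone from /private/.
robots = """User-agent: *
Disallow: /private/
"""
print(is_allowed(robots, "memex-bot", "http://example.com/public/page"))   # True
print(is_allowed(robots, "memex-bot", "http://example.com/private/page"))  # False
```

Being "robust to crawler bans based on robot behavior" presumably means the opposite posture: not tripping the heuristics that sites use to detect and block exactly this kind of automated fetching.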
Information extraction may include normalization of heterogeneous data, natural language processing for translation and entity extraction and disambiguation, image analysis for object recognition, coreference resolution, extraction of multimedia (e.g., pdf, flash, video, image), relevance determination, etc.
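"Normalization of heterogeneous data" sounds abstract, but one tiny concrete instance is collapsing differently formatted phone numbers to a canonical form so that records from different pages can be matched. A hedged sketch (the formats and sample data are invented for illustration):

```python
import re

def normalize_phone(raw: str) -> str:
    """Reduce a US-style phone number to its bare 10 digits."""
    digits = re.sub(r"\D", "", raw)  # drop everything that is not a digit
    # Drop a leading country code "1" so all variants share one form.
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]
    return digits

ads = ["(555) 123-4567", "555.123.4567", "+1 555 123 4567"]
print({normalize_phone(a) for a in ads})  # all three collapse to one value
```

Real extraction pipelines of course deal with far messier inputs (entities, coreference, multimedia), but the principle is the same: map surface variation to one comparable key.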
Technical Area 2: Domain-Specific Search
Technical Area 2 includes the creation of a configurable domain-specific interface into web content. The domain-specific interface may include: conceptually aggregated results, e.g., a person; conceptually connected content, e.g., links for shared attributes; task relevant facets, e.g., key locations, entity movement; implicit collaboration for enriched content; explicit collaboration with shared tags; recommendations based on user model and augmented index, etc.
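The "conceptually connected content, e.g., links for shared attributes" idea can be sketched very simply: two pages become linked when they share an attribute value, such as the same contact number appearing in different ads. A minimal illustration (the records and field names are invented, not from the BAA):

```python
from collections import defaultdict

def link_by_attribute(records, attr):
    """Group page URLs by a shared attribute value."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[attr]].append(rec["url"])
    # Only values seen on more than one page actually connect content.
    return {val: urls for val, urls in groups.items() if len(urls) > 1}

records = [
    {"url": "page-a", "phone": "5551234567"},
    {"url": "page-b", "phone": "5551234567"},
    {"url": "page-c", "phone": "5559999999"},
]
print(link_by_attribute(records, "phone"))
# {'5551234567': ['page-a', 'page-b']}
```

At web scale this would be an index-side join rather than an in-memory dict, but the aggregation concept is the same.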
Also, TA2 performers will work with TA1 performers on the design of a query language for directing crawlers and information extraction algorithms. A language to specify the domain, including both crawling and interface capability, may include concepts, sets of keywords, time delimitations, area delimitations, IP ranges, computational budgets, semi-automated feedback, iterative methods, data models, etc.
Technical Area 3: Applications
"Human Trafficking". Cool, no one can argue that it's not a legit one. I personally think the use case is relevant. Another one is money laundering, because it is very often bound to human trafficking and slavery. I suppose there are many other use cases that would benefit from such a project.
Human Trafficking, especially for the commercial sex trade, is a line of business with significant web presence to attract customers and is relevant to many types of military, law enforcement, and intelligence investigations. The use of forums, chats, advertisements, job postings, hidden services, etc., continues to enable a growing industry of modern slavery. An index curated for the counter-trafficking domain, including labor and sex trafficking, along with configurable interfaces for search and analysis, will enable a new opportunity for military, law enforcement, legal, and intelligence actions to be taken against trafficking enterprises.
Other application domains will be considered during the life of the program, possibly including indexing and interfaces for found data, missing persons, counterfeit goods, etc.
Since technology development will be guided by end-users with operational support expertise, DARPA will engage elements of the DoD and other agencies to develop use cases and operational concepts around Human Trafficking and other domains.
Others? Circumstances? A bit vague...
2. Foreign Participation
Non-U.S. organizations and/or individuals may participate to the extent that such participants comply with any necessary nondisclosure agreements, security regulations, export control laws, and other governing statutes applicable under the circumstances.
D. Other Eligibility Requirements
1. Ability to Support Classified Development, Integration and Transition
While the program itself is unclassified, interactions with end-users and potential transition partners will require Technical Area 3 performers to have access to classified information. Therefore, at the time of award, all prime proposers to Technical Area 3 must have (at a minimum) Secret facility clearance and have personnel under their Commercial and Government Entity (CAGE) code with a Secret clearance that are eligible for TS/SCI. Technical Area 3 proposers must provide their CAGE code and security point(s) of contact in their proposals.
(...)
Proposers for Technical Areas 1 and 2 are not required to have security clearances.
Crawling should also be robust to automated counter-crawling measures, crawler bans based on robot behavior, human detection, paywalls and member-only areas, forms, dynamic and non-HTML content, etc.
If I am running a web server and configured it not to be indexed (robots.txt), it is because I deliberately chose to do so! If I am a member of a private or invitation-only forum, it is maybe because I am trying to dissociate my private and professional life. There are many other genuine examples I could mention.