Showing posts with label Government. Show all posts

Saturday, February 08, 2014

Opening the door on DARPA - DARPA Open Catalog, the DARPA-sponsored open source projects and publications catalog

I Programmer - DARPA Opens Treasure Trove Of Data And Software

The Defense Advanced Research Projects Agency (DARPA) has created an open catalog containing results of its sponsored research in computer science.

As a result of requests from the R&D community, DARPA has created the DARPA Open Catalog, a place for organizing and sharing those results in the form of software, publications, data and experimental details. It contains the results of the programs that DARPA has invested in, covering both fundamental and applied research in computer science.

In an announcement about the new catalog, DARPA says it has an open source strategy for areas of work including big data to help increase the impact of government investments in building a flexible technology base.


DARPA - DARPA Open Catalog Makes Agency-Sponsored Software and Publications Available to All

DARPA has invested in many programs that sponsor fundamental and applied research in areas of computer science, which have led to new advances in theory as well as practical software. The R&D community has asked about the availability of results, and now DARPA has responded by creating the DARPA Open Catalog, a place for organizing and sharing those results in the form of software, publications, data and experimental details. The Catalog can be found at

Many DoD and government research efforts and software procurements contain publicly releasable elements, including open source software. The nature of open source software lends itself to collaboration where communities of developers augment initial products, build on each other's expertise, enable transparency for performance evaluation, and identify software vulnerabilities. DARPA has an open source strategy for areas of work including big data to help increase the impact of government investments in building a flexible technology base. 

“Making our open source catalog available increases the number of experts who can help quickly develop relevant software for the government,” said Chris White, DARPA program manager. “Our hope is that the computer science community will test and evaluate elements of our software and afterward adopt them as either standalone offerings or as components of their products.”


DARPA Open Catalog


Welcome to the DARPA Open Catalog, which contains a curated list of DARPA-sponsored software and peer-reviewed publications. DARPA funds fundamental and applied research in a variety of areas including data science, cyber, anomaly detection, etc., which may lead to experimental results and reusable technology designed to benefit multiple government domains.

The DARPA Open Catalog organizes publicly releasable material from DARPA programs, beginning with the XDATA program in the Information Innovation Office (I2O). XDATA is developing an open source software library for big data. DARPA has an open source strategy through XDATA and other I2O programs to help increase the impact of government investments.

DARPA is interested in building communities around government-funded software and research. If the R&D community shows sufficient interest, DARPA will continue to make available information generated by DARPA programs, including software, publications, data and experimental results. Future updates are scheduled to include components from other I2O programs such as Broad Operational Language Translation (BOLT) and Visual Media Reasoning (VMR).

The DARPA Open Catalog contains two tables:

  • The Software Table lists performers with one row per piece of software. Each piece of software has a link to an external project page, as well as a link to the code repository for the project. The software categories are listed; in the case of XDATA, they are Analytics, Visualization and Infrastructure. A description of the project is followed by the applicable software license. Finally, each entry has a link to the publications from each team's software entry.
  • The Publications Table contains author(s), title, and links to peer-reviewed articles related to specific DARPA programs.
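For anyone scripting against the catalog, each Software Table row maps naturally onto a small record. A minimal Python sketch (the field names are inferred from the table description above, not taken from any official DARPA schema, and the entry values are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """One Software Table row (field names inferred from the
    catalog description; not an official DARPA schema)."""
    performer: str
    software: str
    project_url: str       # external project page
    repo_url: str          # code repository
    category: str          # e.g. "Analytics", "Visualization", "Infrastructure"
    description: str
    license: str           # applicable software license
    publication_urls: list = field(default_factory=list)

entry = CatalogEntry(
    performer="Example Performer",   # hypothetical values throughout
    software="ExampleTool",
    project_url="https://example.org/project",
    repo_url="https://example.org/repo",
    category="Analytics",
    description="Hypothetical big-data analytics library.",
    license="ALv2",
)
print(entry.category)  # Analytics
```

A record like this makes it trivial to filter the catalog by license or category once you've scraped the two tables.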





Wow, that's quite a number of projects and pubs! There should be something of interest for just about any geek. And remember, you, the US taxpayer, paid for this, so go get your dollar's worth! :)


Related Past Post XRef:
From A to W... The US Gov goes Git (and API crazy too). There's an insane amount of data, APIs and OSS projects from the US Government...
Happy Birthday You’ve grown so much in the last year… (from 47 to 272,677 datasets)
Where your Local Government can get naked...(well, as in Budget Transparency, that is)
Whoa, there's a lot of free NOAA [resources]

Tuesday, November 12, 2013

Whoa, there's a lot of free NOAA [resources]

Government Book Talk - Be a NOAA-it-all with these FREE NOAA resources about the weather and oceans

In the morning when I get on the elevator up to my office in GPO headquarters, when they aren’t talking about sports, everyone is chatting about the weather. My colleagues compare the day’s weather with previous years and talk about what’s coming in the days and seasons ahead. Since Washington, DC’s weather varies greatly throughout the year (even through the day!), people in this area are always taking the pulse of the outdoors and our world. In a similar vein, the Department of Commerce’s National Oceanic and Atmospheric Administration or NOAA, the U.S. Government’s oldest scientific agency, is taking the pulse of the Earth, for our benefit.

You can be a NOAA-it-all with these FREE online resources

NOAA says that “Our reach goes from the surface of the sun to the depths of the ocean floor as we work to keep citizens informed of the changing environment around them.” As such, its mission is to understand and predict changes in climate, weather, oceans, and coasts, and to conserve and manage coastal and marine ecosystems and resources.

Once they’ve taken the Earth’s pulse, of course NOAA wants to share the output of their studies. The data the agency gathers when studying the Earth’s atmosphere and oceans is comprehensive, and it is a global leader in communicating how Earth’s atmosphere and water systems influence people’s lives and how they influence those systems. If you learn how to navigate the range of NOAA’s free online resources, including real-time and archived information, you will get the full benefit of this rich data.

NOAA educates the public and disseminates data from its many valuable services, including the National Marine Fisheries Service, the National Ocean Service, the National Weather Service and others. Their information comes packaged in videos, weather alerts, digital coastal charts, entire databases, atlases, podcasts, screensavers, sea sounds, field reports, tagging data, and an entire education Web site for teachers and kids. NOAA has resources for children that are as unique and valuable as their science.

Whether to Weather

For instance, NOAA’s Weather Systems and Patterns page has multimedia, lessons and activities, real-world data, background information, and career profiles. A student who is interested in extreme weather can graph tornado air pressure in the lessons section, investigate the severe weather events page in the real-world data, track a storm in the multimedia section, read the background on severe weather, and even flirt with future career possibilities in the career profile of the tornado chaser.



NOAA Education Resources


Multimedia Gallery



World Ocean Database 2013 (WOD13) is an update of World Ocean Database 2009 (WOD09). All data are available online, presorted by 10° geographic squares, by year, or by user-specified criteria.

This is the World Ocean Database with data and quality control flags exactly as used for the World Ocean Atlas 2013 climatologies. It is a preserved record of input data. The main World Ocean Database dataset builds on this record and includes updates and new data.

Note: The WOD13 has extended standard depth levels. The 40 standard depth levels used in previous versions of WOD are all among the 137 standard depth levels used in WOD13, to provide continuity.
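The 10° geographic squares used for presorting are the classic WMO squares. Here is a rough Python sketch of deriving a square number from a latitude/longitude point; the numbering convention (quadrant digit, then 10° latitude band, then two-digit 10° longitude band) is my assumption, so verify it against the WOD documentation before relying on it:

```python
def wmo_square(lat, lon):
    """Approximate WMO 10-degree square number for a lat/lon point.
    Assumed convention: quadrant digit (1=NE, 3=SE, 5=SW, 7=NW),
    then the 10-deg latitude band, then the two-digit 10-deg
    longitude band -- check against the WOD docs."""
    if lat >= 0:
        quadrant = 1 if lon >= 0 else 7
    else:
        quadrant = 3 if lon >= 0 else 5
    lat_band = int(abs(lat) // 10)   # 0..8
    lon_band = int(abs(lon) // 10)   # 0..17
    return quadrant * 1000 + lat_band * 100 + lon_band

print(wmo_square(36.5, -122.5))  # 7312 under the assumed convention
```

Presorting by square means you can pull just the files covering your region of interest instead of the whole global database.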


World Ocean Database and World Ocean Atlas Series


Hey, we're paying for it (at least those of us in the US), so we might as well use what we can, right?

Wednesday, November 06, 2013

We're from MSDN Magazine and we're here to help... MSDN Magazine Government Special Issue

MSDN Magazine - Government Special Issue of MSDN Magazine

Visit the MSDN Magazine Web site and you’ll see we’ve been a bit busier than usual of late. In addition to the regularly scheduled November issue, featuring guidance on ASP.NET Web API, MVVM development, and Office 365 and Visual Studio 2013, this month we published an extra issue of MSDN Magazine focused specifically on government application development. The MSDN Magazine Digital Government Issue shines a light on a dynamic and fast-growing sector of application development, and offers useful case-in-point solutions to real world development challenges.

In the wake of a near-disastrous government shutdown and the actually-disastrous Web site rollout, it would seem November might be an inopportune time to shine a light on government-focused app dev. In fact, the opposite is true. The recent challenges show just how vital mature and advanced app dev is to government operation, whether it’s preserving continuity of service during a shutdown or enabling the successful rollout of a massive and complex Web-based enrollment program.

The focus of this special issue isn’t on big ticket internal government development, but rather on enabling citizen-facing services. The articles explore the opportunities independent developers and contractors have to create value by tapping into a growing array of government services, resources and data stores. The issue also looks at ways that government entities—particularly at the local level—can benefit from savvy software development.

Tim Kulp’s “Engage Governments and Communities with Open311,” for example, shows how a JavaScript-based Windows Store application lets citizens interact with local government, reporting issues like a misbehaving stoplight or dangerous pothole from a phone or Web browser...
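Open311 itself is a simple REST API (GeoReport v2), so the citizen-reporting flow is easy to sketch. Here's a hedged Python example that just builds the form payload for a service request; the endpoint URL and API key are placeholders, and you should check the field names against the GeoReport v2 spec for your city's endpoint:

```python
def build_service_request(service_code, lat, lon, description, api_key):
    """Build the form payload for an Open311 GeoReport v2
    POST /requests.json call (field names per the GeoReport v2 spec;
    the api_key is issued by the city's Open311 endpoint)."""
    return {
        "api_key": api_key,
        "service_code": service_code,  # the city's code for e.g. "pothole"
        "lat": f"{lat:.6f}",
        "long": f"{lon:.6f}",          # the spec uses "long", not "lon"
        "description": description,
    }

payload = build_service_request("POTHOLE", 39.285, -76.610,
                                "Deep pothole in the right lane", "MY_KEY")
# In a real app you'd POST this (the URL below is a placeholder):
# requests.post("https://city.example.gov/open311/v2/requests.json", data=payload)
print(payload["service_code"])
```

The same payload shape works from JavaScript in a Windows Store app; the spec is deliberately plain HTTP forms so any client can use it.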


I will not mention the ACA or related web sites... I will not.. oh, darn I already did... WELL in THAT case! :P

Actually, I think this is really kind of cool and the poor guys and gals doing work in this area really need all the help and support they can get (that's not snarky... can YOU imagine having Congress and the White House and all those agencies as your boss? The poor code slingers just don't have a chance... )

Tuesday, October 29, 2013

"We're from the Government and we're here to help with Cybersecurity..." - NIST Preliminary Cybersecurity Framework Released

ride the lightning - NIST Releases Preliminary Cybersecurity Framework

How in the world are entities supposed to deal with cybersecurity in a world without standards, even voluntary ones? The National Institute of Standards and Technology (NIST) is looking to remedy that. On October 22nd, NIST released a Preliminary Cybersecurity Framework to help critical infrastructure owners and operators reduce cybersecurity risks in industries such as power generation, transportation and telecommunications. NIST will open a 45-day public comment period on the Preliminary Framework and plans to release the official framework in February 2014.


NIST Releases Preliminary Cybersecurity Framework, Will Seek Comments

The U.S. Department of Commerce's National Institute of Standards and Technology (NIST) today released its Preliminary Cybersecurity Framework (PDF) to help critical infrastructure owners and operators reduce cybersecurity risks in industries such as power generation, transportation and telecommunications. In the coming days, NIST will open a 45-day public comment period on the Preliminary Framework and plans to release the official framework in February 2014, as called for in Executive Order 13636—Improving Critical Infrastructure Cybersecurity.

In February 2013, President Obama directed NIST to work with stakeholders to develop a voluntary framework for reducing cyber risks, recognizing that U.S. national and economic security depends on the reliable functioning of critical infrastructure. Through a request for information and a series of workshops held throughout 2013, NIST engaged with more than 3,000 individuals and organizations on standards, best practices and guidelines that can provide businesses, their suppliers, their customers and government agencies with a shared set of expected protections for critical information and IT infrastructure.

"Thanks to a tremendous amount of industry input, the voluntary framework provides a flexible, dynamic approach to matching business needs with improving cybersecurity," said Under Secretary of Commerce for Standards and Technology and NIST Director Patrick Gallagher. "We encourage organizations to begin reviewing and testing the Preliminary Framework to better inform the version we plan to release in February."

The Preliminary Framework outlines a set of steps that can be customized to various sectors and adapted by both large and small organizations while providing a consistent approach to cybersecurity. It offers a common language and mechanism for organizations to determine and describe their current cybersecurity posture, as well as their target state for cybersecurity. The framework will help them to identify and prioritize opportunities for improvement within the context of risk management and to assess progress toward their goals.

The framework will foster communications among internal and external stakeholders and help organizations hold each other accountable for strong cyber protections while allowing flexibility for specific approaches tailored to each business' market and regulatory environment. Its integrated approach focuses on outcomes, rather than any particular technology, to encourage innovation.
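That "current posture versus target state" idea lends itself to a simple gap calculation. A toy Python sketch (the function name, the numeric tier scale, and the scoring are illustrative inventions, not anything defined by the Framework itself):

```python
def profile_gaps(current, target):
    """Return (category, gap) pairs where the target tier exceeds the
    current tier, biggest gap first. Toy model only: the real Framework
    organizes practices into Functions/Categories with four
    implementation tiers."""
    gaps = {cat: target[cat] - current.get(cat, 0)
            for cat in target if target[cat] > current.get(cat, 0)}
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

# Illustrative 0-3 "tier" scores per top-level Framework function.
current = {"Identify": 2, "Protect": 1, "Detect": 1, "Respond": 2, "Recover": 1}
target  = {"Identify": 3, "Protect": 3, "Detect": 2, "Respond": 2, "Recover": 2}
print(profile_gaps(current, target))
# [('Protect', 2), ('Identify', 1), ('Detect', 1), ('Recover', 1)]
```

Even a crude scoring like this gives an organization a ranked to-do list, which is roughly the "identify and prioritize opportunities for improvement" workflow the Framework describes.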




It's just too easy to take pot shots, so I won't. I'll applaud our Government for at least trying to do this. IMHO, this is a good kind of role for government...


Related Past Post XRef:
Never a Cloudy day in DC? Not if the NIST has anything to say about it... "DRAFT Cloud Computing Synopsis and Recommendations"

Friday, October 04, 2013

Where your Local Government can get naked...(well, as in Budget Transparency, that is) - Simi Valley



What an awesome way to grok my home town's budget. While you'd think "budget = boring," this site makes it actually fun to look at, explore and spelunk the budget. It's very eye-opening to see where all the money is going...

Wednesday, September 25, 2013

Space Mining - Science Fiction or just Science? (NASA wonders about exo-solar Diamond space mining)

open.NASA - Awesome Stuff in Space: Planet Mining GONE

You may have heard of the hyper-startup Planetary Resources, a company which aims to “expand Earth’s natural resource base” by developing (and eventually using) the technology to mine asteroids in the Solar System. They also have a lot of money, with investors such as Larry Page (co-founder of Google) and James Cameron (writer of Rambo: First Blood Part II). But what if they’re thinking a bit too small? There’s an exoplanet called 55 Cancri e which orbits the star 55 Cancri A ~41 light-years away from Earth. What makes 55 Cancri e interesting is that it has about 8.63 times the mass of the Earth, and there is a good chance that about a third of that mass is diamond. That’s a lot of diamond. Let’s be optimistic and assume for the moment that 55 Cancri e does in fact contain 2.88 times the mass of the Earth in diamond. What would happen if Planetary Resources really turned on the afterburner and tried to mine 55 Cancri e?


It would be prudent to open up the wormhole portal thing reasonably far away from Earth; we’ll use 35,786 km because that’s the same height as a geosynchronous orbit and I can calculate how much it costs to get our mining equipment up there! Using the SpaceX Falcon Heavy launch system, currently the world’s most powerful rocket, it costs $130 million to get 21.2 tonnes up to that altitude. I don’t know much about mining equipment, so I just found the most expensive-looking drill I could. I know it probably won’t be the right kind of thing, but I’m going to use its mass as a benchmark for the weight of the fancy laser-using robotic miners we’re going to deploy to 55 Cancri e. Fortunately, the expensive-looking drill weighs 20.4 tonnes, so we’ll say each Robo-Miner thing costs $130 million to get to 55 Cancri e (assuming the wormhole was already there or something). We’re also going to want to bring some transportation equipment to send the diamonds back to Earth! Since there likely aren’t any people on 55 Cancri e, we can probably put one of the wormhole entrances on the surface of 55 Cancri e (and figure out some way to slow down our equipment when it passes through the wormhole from geostationary transfer orbit). I’m going to use the specs of the SpaceX Dragon as a benchmark for what the specifications of our diamond return vehicle will be like (we’d need a souped-up heat shield though!). A SpaceX Dragon weighs 4.2 tonnes and can return to Earth with 3.13 tonnes of extra cargo, so if we use some fancy folding system we can assume that we can launch 5 diamond return vehicles on one SpaceX Falcon Heavy. Since R&D/construction/running costs are notoriously hard to estimate, let’s assume that the cost will be in the same order of magnitude as the Apollo program (I’ve excluded the cost of the launch vehicles, because we’re going to get them from SpaceX): $15.2 billion. I’m going to sum up our costs below:

Startup Costs: 20 Robo-Miners x $130 million each for launch = $2.6 billion. Total Estimated Costs = $2.6 billion + $15.2 billion = $17.8 billion (This does not take into account the cost of the Diamond Return Vehicle launches).

Cost of each Diamond Return Vehicle: $130 million for each launch/5 Vehicles = $26 million.

Current estimated price of Diamond: roughly $1 million for 0.01192 kilograms. Therefore 1 Tonne of Diamond is currently worth ~$83,892 million.

Cost to return 1 Tonne of Diamond: Each Diamond Return Vehicle costs $26 million and can return 3.13 Tonnes of Diamond. Therefore 1 Tonne costs $8.3 million

Profit on each Tonne of Diamond: $83,892 million – $8.3 million = ~$83.9 billion

As you can see from the above, this would be very profitable! Keep in mind the effect INFINITY DIAMOND would have on the price of diamond (Economists, help me out!). It would give the miners of 55 Cancri e essentially a monopoly in the diamond industry though, and each capsule returning to Earth would make ~$262.6 billion in profit at the diamond prices of today! This would likely make the extremely high R&D costs much easier to stomach!
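The back-of-envelope arithmetic quoted above is easy to sanity-check in code. This Python sketch just replays the post's own figures (the per-tonne diamond valuation, launch cost, and Apollo-scale R&D budget all come straight from the excerpt):

```python
# Replaying the post's back-of-envelope arithmetic; work in millions of $.
M = 1.0

launch_cost = 130 * M            # one Falcon Heavy launch (from the excerpt)
robo_miners = 20
apollo_scale_rnd = 15_200 * M    # assumed Apollo-order R&D/construction budget

startup = robo_miners * launch_cost + apollo_scale_rnd
# matches the post's $17.8 billion startup estimate

vehicles_per_launch = 5
cargo_per_vehicle_tonnes = 3.13              # Dragon return capacity
vehicle_cost = launch_cost / vehicles_per_launch       # $26M per vehicle
return_cost_per_tonne = vehicle_cost / cargo_per_vehicle_tonnes

value_per_tonne = 83_892 * M     # the post's diamond valuation per tonne
profit_per_tonne = value_per_tonne - return_cost_per_tonne
profit_per_capsule = profit_per_tonne * cargo_per_vehicle_tonnes

print(round(return_cost_per_tonne, 1))   # ~8.3  (million $ per tonne)
print(round(profit_per_capsule / 1000))  # ~263  (billion $ per capsule)
```

Running the numbers yourself is a good habit with posts like this; the costs are so dwarfed by the diamond valuation that the conclusion survives any rounding.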


I SO want to be a space miner. If our race is to survive, we need to get off this rock... and if there's money to be made doing it, well so much the better! :)

Tuesday, September 10, 2013

"Winging It" is a pretty rough Preparedness Plan (aka. "Fake it 'till you make it" doesn't work as an Emergency Plan) - Winging It Is Not an Emergency Plan

Emergencies can occur with no warning. Do you have a supplies kit and a plan of action?

September is National Preparedness Month. Visit for guidance on what to do before, during, and after different kinds of natural disasters and other emergencies.

Another action you can take is to join the National Preparedness Community. It’s free and open to all. As a member, you’ll have access to special preparedness resources and can collaborate with others in your community.

FEMA - National Preparedness Community


National Preparedness Month 2013 Toolkit

The National Preparedness Month (NPM) 2013 Toolkit includes suggestions for activities and events that state, local, tribal and territorial governments, business, non-governmental organizations, and community organizations could sponsor to promote NPM.

This toolkit also includes templates and drafts of newsletter articles, blogs, posters, and other collateral material that you are able to use in various outreach efforts. As you familiarize yourself with the toolkit, keep in mind the audiences that you work with, and select the tools that are best able to help your organization reach them most effectively.

The National Preparedness Month (NPM) 2013 Toolkit is available below in its entirety. Individual chapters and resources by audience from the NPM Toolkit are also listed below according to the Toolkit's Table of Contents


You've heard me stand on the soapbox before, so I won't repeat it all. Just let me say preparedness is a process, not a destination...


Related Past Post XRef:
Prep'ing your Pets, National Animal Disaster Preparedness Day is May 8th
When the unthinkable happens, make sure you think about your pets... Caring for your animals in a disaster means preparing now.
CDC does Earthquakes, Preparedness that is...
"Homebuilders' Guide to Earthquake-Resistant Design and Construction" Free PDF from FEMA
While it can be too late to prepare, it's never too early... Being ready is not hard, just do it, bit by bit...
Be prepared and know what you've got - Free Home Inventory Spreadsheet
A 2012 Survival Guide from How Stuff Works (Nonsense, but being prepared isn’t)
Centers for Disease Control (US CDC) provides Zombie Apocalypse 101 Survival Tips (really...) - And Zombie badges too!
Live on Earth? Then you live in an earthquake zone (your local activity may vary). When should you think about preparing for one? Um… Now!
Your Evacuation Plan – Do you have one? The time to make one is now, BEFORE you really need it…
National Preparedness Month: Don’t be afraid.. Be Ready

Monday, August 26, 2013

Buying your own Mobile Launch Platform from NASA, bid now...

DVICE - Bid on a piece of NASA history

Have you ever wanted to own a piece of NASA history? Here's your chance: NASA is currently seeking bidders for three of its launch pads used during moon missions. Originally built in 1967, each 3700 ton pad was officially used at the Kennedy Space Center to not only carry the Apollo moon program's rockets from an assembly area to the launch site, but to also send those rockets into space. The launch pads were later redesigned to accommodate space shuttles and were used regularly until 2010. Note that it's just the pads themselves that are for sale, not the crawler transporters.

Due to the enormous size and weight of the pads, moving them to a museum would be extremely difficult, and few have adequate space to store and display them. Considering that the launch pads come equipped with the necessary equipment, supplies and connections to launch a rocket...


Synopsis - Aug 16, 2013
RFI - Mobile Launch Platforms - Posted on Aug 16, 2013
General Information

Solicitation Number:

Posted Date:
Aug 16, 2013

FedBizOpps Posted Date:
Aug 16, 2013

Recovery and Reinvestment Act Action:

Original Response Date:
Sep 06, 2013

Current Response Date:
Sep 06, 2013

Classification Code:
W -- Lease or Rental of equipment


Contracting Office Address
NASA/John F. Kennedy Space Center, Procurement, Kennedy Space Center, FL 32899

This notice is issued by the NASA/KSC to post a Request for Information via the internet, and solicit responses from interested parties. This document is for information and planning purposes and to allow industry the opportunity to comment and respond to this request. Interested parties are invited to submit written comments or questions to the Contracting Officer listed below no later than August 30, 2013. When responding please reference RFI-KSC-MLP2013.

This presolicitation synopsis and Request for Information is not to be construed as a commitment by the Government, nor will the Government pay for the information submitted in response. Respondents will not be notified of the results.

See the attached RFI file for further details.

The mentioned RFI docs are actually pretty cool. If you've ever wondered what makes up a launcher, check the docs out...

Introduction: NASA Kennedy Space Center (KSC) is soliciting information and/or concepts relating to traditional and non-traditional reuse or disposal options for the former Apollo and Space Shuttle Mobile Launch Platforms (MLPs). These large structures are no longer in use at KSC, and currently there is not a foreseeable Agency need. NASA currently has no appropriated funds for any divestment option.

The MLPs were used by KSC for stacking, transporting, and launching operations during a Space Shuttle flow. The MLPs were originally constructed and used for the Apollo Program in the 1960’s – early 1970’s, and were then renovated and modified to support the Space Shuttle Program. The last launch off an MLP was July 8, 2011.


Figure 1 & 2. MLP-1 at KSC Park Site, MLP-1 underneath

These three (3) nearly identical MLPs are composed mostly of steel and weigh approximately 8.2 million pounds each. They measure 160’x135’x25’ (Length x Width x Height). The height does not include the holding posts shown in Figure 1. They also have three (3) flame holes, shown in Figure 2. Each MLP is a two-story hollow structure featuring an elaborate maze of pathways, compartments, plumbing, and electrical cabling.

Purpose: The purpose of this Request for Information (RFI) is to gather data for KSC to assess potential divestment strategies for one (1) or more of the three (3) MLP(s) available. This RFI requests interested parties to provide concepts and supporting information on one or several of the options listed below. KSC may be willing to enter into reimbursable agreements to provide working area, heavy equipment operations, and unique engineering support. All options may occur after completion of the General Services Administration (GSA) property disposal process, through which NASA would completely divest ownership responsibilities for the MLPs.




Wednesday, July 31, 2013

Opening the U.S. Code, does the U.S. House, release in XML it does...

E Pluribus Unum - U.S. House of Representatives publishes U.S. Code as open government data

Three years on, Republicans in Congress continue to follow through on promises to embrace innovation and transparency in the legislative process. Today, the United States House of Representatives has made the United States Code available in bulk Extensible Markup Language (XML).

“Providing free and open access to the U.S. Code in XML is another win for open government,” said Speaker John Boehner and Majority Leader Eric Cantor, in a statement posted to “And we want to thank the Office of Law Revision Counsel for all of their work to make this project a reality. Whether it’s our ‘read the bill’ reforms, streaming debates and committee hearings live online, or providing unprecedented access to legislative data, we’re keeping our pledge to make Congress more transparent and accountable to the people we serve.”

House Democratic leaders praised the House of Representatives Office of the Law Revision Counsel (OLRC) for the release of the U.S. Code in XML, demonstrating strong bipartisan support for such measures.

“OLRC has taken an important step towards making our federal laws more open and transparent,” said Whip Steny H. Hoyer, in a statement.


“Just this morning, Josh Tauberer updated our public domain U.S. Code parser to make use of the new XML version of the US Code,” said Mill. “The XML version’s consistent design meant we could fix bugs and inaccuracies that will contribute directly to improving the quality of GovTrack’s and Sunlight’s work, and enables more new features going forward that weren’t possible before. The public will definitely benefit from the vastly more reliable understanding of our nation’s laws that today’s XML release enables.” (More from Tom Lee at the Sunlight Labs blog.)


“Last year, we reported that House Republicans had the transparency edge on Senate Democrats and the Obama administration,” he said. “(House Democrats support the Republican leadership’s efforts.) The release of the U.S. Code in XML joins projects like and in producing actual forward motion on transparency in Congress’s deliberations, management, and results.

For over a year, I’ve been pointing out that there is no machine-readable federal government organization chart. Having one is elemental transparency, and there’s some chance that the Obama administration will materialize one with the Federal Program Inventory. But we don’t know yet if agency and program identifiers will be published. The Obama administration could catch up or overtake House Republicans with a little effort in this area. Here’s hoping they do.”

House of Representatives - US Code Most Current Release Point

Public Law 113-21
(Titles in bold are updated at this release point)

Information about the currency of United States Code titles is available on the Currency page.


The United States Code in XML uses the USLM Schema. That schema is explained in greater detail in the USLM Schema User Guide. For rendering the XML files, a Stylesheet (CSS) file is provided.

Each update of the United States Code is a "release point". This page contains links to downloadable files for the most current release point. The available formats are XML, XHTML, and PCC (photocomposition codes, sometimes called GPO locators). Certain limitations currently exist. Although older PDF files (generated through Microcomp) are available on the Annual Historical Archives page, the new PDF files for this page (to be generated through XSL-FO) are not yet available. In addition, the five appendices contained in the United States Code are not yet available in the XML format.

Links to files for prior release points are available on the Prior Release Points page. Links to older files are available on the Annual Historical Archives page.
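If you want to poke at the new XML yourself, the files parse with any standard XML library. A minimal Python sketch that pulls section numbers and headings out of a USLM-style document; the sample document and namespace URI below are my reconstruction from the USLM schema materials, so verify them against the USLM Schema User Guide before depending on them:

```python
import xml.etree.ElementTree as ET

# Tiny stand-in for a USLM title file (real files: uscode.house.gov).
# Namespace URI per the USLM schema docs -- verify in the User Guide.
USLM_NS = "http://xml.house.gov/schemas/uslm/1.0"
sample = f"""<uscDoc xmlns="{USLM_NS}">
  <main>
    <section identifier="/us/usc/t1/s1">
      <num value="1">&#167; 1.</num>
      <heading>Words denoting number, gender, and so forth</heading>
    </section>
  </main>
</uscDoc>"""

ns = {"uslm": USLM_NS}
root = ET.fromstring(sample)
sections = [(sec.find("uslm:num", ns).get("value"),
             sec.find("uslm:heading", ns).text)
            for sec in root.iterfind(".//uslm:section", ns)]
print(sections)
```

Even this much is a step up from scraping XHTML: the `identifier` and `value` attributes give you stable, citable handles on each section.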


While pretty cool, I was expecting something different. Seems the XML is really pretty much XHTML. So while it IS XML, it's still a display markup schema...


Guess we'll have to wait for this to complete: Legislative Data Challenge - Win $5k challenge by helping the Library of Congress make US laws machine readable... Still, I applaud the effort!


Related Past Post XRef:
Legislative Data Challenge - Win $5k challenge by helping the Library of Congress make US laws machine readable...
From A to W... The US Gov goes Git (and API crazy too). There's an insane amount of data, APIs and OSS projects from the US Government...

Friday, July 26, 2013

National app privacy code of conduct released by the US (and it's only 6 pages... Well, the short form is, anyway...)

The Verge - US government announces first national app privacy code of conduct

Mobile apps and consumers' expectations of privacy are not always in sync. But now the government is on the case. The US National Telecommunications and Information Administration (NTIA), part of the Commerce Department, has finished working on the first version of a voluntary national "privacy code of conduct" for mobile apps. The NTIA has been working on the code of conduct for over a year, and written (and rewritten) numerous drafts, trying to balance the input of industry players including AT&T and the Internet Commerce Coalition (which represents AOL and eBay among others), with privacy groups, such as the Electronic Frontier Foundation and the American Civil Liberties Union. But now, it has finally managed to come up with a draft that it says should satisfy all sides, while also helping protect consumer privacy.

The code of conduct is again, strictly voluntary and not enforced under any laws, so it's up to app developers and ecosystems to adopt it at their will. But the NTIA is hopeful that they will in short order...


I. Preamble: Principles Underlying the Code of Conduct

Below is a voluntary Code of Conduct for mobile application (“app”) short notices developed through the Multi-Stakeholder Process on Application Transparency convened by the United States Department of Commerce. The purpose of the short form notices is to provide consumers enhanced transparency about the data collection and sharing practices of apps that consumers use. This code does not apply to software that a consumer does not interact directly with or to inherent functions of the device. This code also does not apply to apps that are solely provided to or sold to enterprises for use within those businesses.

This Code of Conduct incorporates guidance from privacy, civil liberties, and consumer advocates, app developers, app publishers, and other entities across the mobile ecosystem. The transparency created by displaying information about application practices in a consistent way as set forth in this code is intended to help consumers compare and contrast data practices of apps. These short notices seek to enhance consumer trust in app information practices without discouraging innovation in mobile app notice or interfering with or undermining the consumer’s experience.  

This preamble explains the goals of the Code of Conduct and provides some guidance to developers regarding implementation. However, it does not impose operational requirements beyond those set forth in Sections II., III., and IV. below.   

Where practicable, app developers are encouraged to provide consumers with access to the short notice prior to download or purchase of the app.   When appropriate, some app developers may elect to offer short form notice in multiple languages.  

App developers should be aware that there are other Fair Information Practices (FIPs) beyond transparency; app developers are encouraged to adhere to the full set of FIPs.    This Code of Conduct addresses short form notices about collection and sharing of consumer information with third parties. App developers should be aware that California’s Online Privacy Protection Act and other privacy laws may also require app developers to post a long form privacy policy...


I so want to insert something snarky, but I'm frankly tired of that (well, I'm never tired of being snarky, but the PRISM, NSA, Kinect-is-a-spy, tin-foil-hat, omg-my-cat's-going-to-kill-me-in-my-sleep, yada, yada, stuff). Anyway, it's good to see some active guidance that looks reasonable and actionable...

Thursday, July 25, 2013

"Manhattan District History" - History of the Manhattan Project is becoming available online (and free)

NextGov - History of the Atom Bomb Goes Online

The Department of Energy has started to post online the internal history of the first atomic bombs.

It was commissioned by Lt. Gen. Leslie Groves, head of the Manhattan Project, which managed the nationwide complex of labs and factories that developed and produced the raw material for the first atom bombs in a crash three-year project that eventually employed 130,000 people and cost $26 billion in current dollars.

The Manhattan District History consists of 36 volumes grouped in eight books, with a third of the volumes, or parts of volumes, still classified. DoE said the rest of the volumes have been declassified, with some made available to the public on microfilm.

One of the online documents offers fascinating insights into Operation Peppermint, which aimed to determine whether the Germans had developed a radiological weapon, using among other things film distributed to troops to detect radiation fogging.


Classified volumes will be declassified with redactions, and the remaining unclassified parts made available to the public, posted incrementally as review and processing is completed.


U.S. Department of Energy - Manhattan District History

General Leslie Groves, head of the Manhattan Engineer District, in late 1944 commissioned a multi-volume history of the Manhattan Project called the Manhattan District History. Prepared by multiple authors under the general editorship of Gavin Hadden, a longtime civil employee of the Army Corps of Engineers, the classified history was "intended to describe, in simple terms, easily understood by the average reader, just what the Manhattan District did, and how, when, and where." The volumes record the Manhattan Project's activities and achievements in research, design, construction, operation, and administration, assembling a vast amount of information in a systematic, readily available form. The Manhattan District History contains extensive annotations, statistical tables, charts, engineering drawings, maps, photographs, and detailed indices. Only a handful of copies of the history were prepared. The Department of Energy's Office of History and Heritage Resources is custodian of one of these copies.

The history is arranged in thirty-six volumes grouped in eight books. Some of the volumes were further divided into stand-alone chapters. Several of the volumes and stand-alone chapters were never security classified. Many of the volumes and chapters were declassified at various times and were available to the public on microfilm. Approximately a third of the volumes, or parts of volumes, remain classified.

The Office of Classification and the Office of History and Heritage Resources, in collaboration with the Department's Office of Science and Technical Information, have committed to making available full-text on this OpenNet website the entire thirty-six volume Manhattan District History. Unclassified and declassified volumes will be scanned and posted as available. Classified volumes will be declassified with redactions, i.e., still classified terms, phrases, sentences, and paragraphs will be removed and the remaining unclassified parts made available to the public. The volumes will be posted incrementally as review and processing is completed.

Following is a listing of the books, volumes, and stand-alone chapters of the Manhattan District History. Links to pdf copies are provided for those volumes and chapters that currently are available.


Book 1 - Volume 8 - Personnel



Not something you might want to take on vacation for light reading, but I love this kind of stuff...

[Irony alert] Worried about the NSA reading your emails? Heck, they can't bulk search their own...

ProPublica - NSA Says It Can’t Search Its Own Emails

The NSA is a "supercomputing powerhouse" with machines so powerful their speed is measured in thousands of trillions of operations per second. The agency turns its giant machine brains to the task of sifting through unimaginably large troves of data its surveillance programs capture.

But ask the NSA, as part of a freedom of information request, to do a seemingly simple search of its own employees' email? The agency says it doesn’t have the technology.

"There's no central method to search an email at this time with the way our records are set up, unfortunately," NSA Freedom of Information Act officer Cindy Blacker told me last week.

The system is “a little antiquated and archaic," she added.

I filed a request last week for emails between NSA employees and employees of the National Geographic Channel over a specific time period. The TV station had aired a friendly documentary on the NSA and I want to better understand the agency's public-relations efforts.

A few days after filing the request, Blacker called, asking me to narrow my request since the FOIA office can search emails only “person by person," rather than in bulk. The NSA has more than 30,000 employees.

I reached out to the NSA press office seeking more information but got no response.

It’s actually common for large corporations to do bulk searches of their employees' email as part of internal investigations or legal discovery.


Well of course THEIR emails are not going through their uber-super-duper email tracking/search thing! That wouldn't be secure!  :P

Sunday, July 21, 2013

Is the DHS/Department of Homeland Security Following You (on Twitter)?

public intelligence - List of Twitter Accounts Followed by DHS Media Monitoring Capability

The following is a list of Twitter accounts followed by the official account of the Department of Homeland Security National Operations Center Media Monitoring Capability. The document was obtained via FOIA request by Carlton Purvis.

1. DHS NOC MMC uses two Twitter applications: TweetDeck and TweetGrid

2. DHS NOC MMC does NOT use any Twitter widgets

3. DHS NOC MMC follows the Twitter accounts shown in the list below. NOTE, in accordance with DHS Privacy direction, DHS NOC MMC follows only authorized accounts and never follows accounts of private individuals.

Twitter accounts followed by @DHSNOCMMC1:


I just thought this was interesting and heck, I found a number of accounts that were interesting (and am now following too... :)

If you're looking for a list of news accounts, this is a pretty darn good one...

Friday, July 19, 2013

Wednesday, July 17, 2013

Legislative Data Challenge - Win $5k challenge by helping the Library of Congress make US laws machine readable...

Nextgov - Contest Aims to Make Proposed U.S. Laws Machine Readable Worldwide

The Library of Congress is crowdsourcing an initiative to make it easier for software programs around the world to read, understand and categorize federal legislation.

The library is offering a $5,000 prize to the contestant whose entry best fits U.S. legislation into Akoma Ntoso, an internationally-developed framework that aims to be the standard for presenting legislative data in machine-readable formats.


News from the Library of Congress - Library of Congress Announces Legislative Data Challenge

The Library of Congress, at the request of the U.S. House of Representatives, is utilizing an online challenge platform to advance the exchange of legislative information worldwide.

Akoma Ntoso is a framework used in many other countries around the world to annotate and format electronic versions of parliamentary, legislative and judiciary documents. The challenge, "Markup of U.S. Legislation in Akoma Ntoso", invites competitors to apply the Akoma Ntoso schema to U.S. federal legislative information so it can be more broadly accessed and analyzed alongside legislative documents created elsewhere.

"The Library works closely with the Congress and related agencies to make America’s federal legislative record more widely available," said Robert Dizard Jr., Deputy Librarian of Congress. "This challenge will build on that accessibility goal by advancing the possibilities related to international frameworks. American legislators, analysts, and the public can benefit from international standards that reflect U.S. legislation, thereby allowing better comparative legislative information. We are initiating this effort as people around the world are working to share legislative information across nations and other jurisdictions."

Utilizing U.S. bill text, challenge participants would attempt to mark up the text into electronic versions using the Akoma Ntoso framework. Participants will be expected to identify any issues that appear when applying the Akoma Ntoso schema to U.S. bill text, recommend solutions to resolve those issues, and provide information on the tools used to create the markup.

The challenge, which opened today and closes Oct. 31, 2013, is extended to participants 18 years of age or older. For the official rules and more detailed information about the challenge, or to enter a submission, visit the challenge website.

The competition’s three judges are experts in either U.S. legislation XML standards or the Akoma Ntoso legal schema. The Library of Congress will announce the winner of the $5,000 prize on Dec. 19, 2013.


Akoma Ntoso

Akoma Ntoso (“linked hearts” in the Akan language of West Africa) defines a “machine readable” set of simple technology-neutral electronic representations (in XML format) of parliamentary, legislative and judiciary documents.

Akoma Ntoso XML schemas make “visible” the structure and semantic components of relevant digital documents so as to support the creation of high value information services to deliver the power of ICTs to increase efficiency and accountability in the parliamentary, legislative and judiciary contexts.

Akoma Ntoso is an initiative of the "Africa i-Parliament Action Plan", a programme of UN/DESA.


I'm trying really hard to be supportive of this and not be snarky (like, at least with this, something will read the laws Congress passes... Oh darn, see what I mean? ;)
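For a rough idea of what the challenge asks for, here's a sketch that builds a minimal Akoma Ntoso-flavored document with Python's ElementTree. The element names (akomaNtoso, bill, body, section, num, heading, content) come from the Akoma Ntoso vocabulary, but the namespace URI below is a placeholder - it differs between schema versions, so check the official schema before relying on it:

```python
import xml.etree.ElementTree as ET

# Placeholder namespace; the real URI depends on the Akoma Ntoso schema version.
AKN = "http://www.akomantoso.org/2.0"
ET.register_namespace("", AKN)

def q(tag: str) -> str:
    """Qualify a tag name with the Akoma Ntoso namespace."""
    return f"{{{AKN}}}{tag}"

# Build a minimal bill with one marked-up section.
root = ET.Element(q("akomaNtoso"))
bill = ET.SubElement(root, q("bill"))
body = ET.SubElement(bill, q("body"))
section = ET.SubElement(body, q("section"))
ET.SubElement(section, q("num")).text = "SEC. 2."
ET.SubElement(section, q("heading")).text = "Definitions"
content = ET.SubElement(section, q("content"))
ET.SubElement(content, q("p")).text = 'In this Act, the term "agency" means...'

print(ET.tostring(root, encoding="unicode"))
```

Note how the section number and heading are semantic elements here, which is exactly what makes this kind of markup machine readable in a way that display-oriented XHTML isn't.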

Tuesday, June 04, 2013

Gov Webicons - Your one-stop, 41-agency shop for US Government agency icons (SVG/PNG)

open.NASA - Introducing Gov Webicons


After doing some research on SVG images on the web, I came across FC Webicons, a set of social media icons presented in SVG with PNG fallbacks for older browsers that don’t support SVG images. As that code is open source, I adapted it to present NASA’s logo in a futuristic, resolution independent way…but decided, hey, it was so much fun, why stop there?

I’m happy to introduce today Gov Webicons, a set of 41 federal agency icons that you can use on your website with just two lines of code. Creating a dashboard of agencies? Include them all. Want to just update your own agency’s icon? Take out just the code and images you need and have fun living in the future.

Gov Webicons is open source and hosted on GitHub. I’d love to include additional federal agencies and can do so as long as there is a publicly available SVG version of their graphic online. Any tips on improving Gov Webicons or ideas for more agencies to add? Drop me a line in the comments below, I’d love to hear your thoughts!

seanherron / Gov-Webicons

Gov Webicons Set is a set of resolution-independent social icons for use on your website. They use feature-detected SVG graphics (with PNG fallbacks) to display the icons over their appropriate negatively indented anchor titles.

If you have icon suggestions, either submit a pull request or open an issue with a link to the SVG file of the agency you wish to see included.

Enjoy Gov Webicons!

Based on FC Webicons (code is CC-Attribution licensed).


  1. Include gov-webicons.css in your HTML
  2. Use the following code to include a federal agency icon:
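The README's code snippet didn't survive the copy above, so here's roughly what the FC Webicons pattern it's based on looks like. The class names and filename here are guesses for illustration; check the repo's README for the actual markup:

```html
<!-- 1. Pull in the stylesheet -->
<link rel="stylesheet" href="gov-webicons.css">

<!-- 2. Hypothetical icon markup: a class pair selects the agency graphic,
     and the stylesheet hides the anchor text off-screen -->
<a href="https://www.nasa.gov" class="webicon nasa">NASA</a>
```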


Included Agencies

  • Air Force
  • Archives
  • Army
  • Centers for Disease Control and Prevention
  • Coast Guard
  • Congress
  • Department of Homeland Security
  • Department of Commerce
  • Department of Defense
  • Department of Energy
  • Department of Justice
  • Department of Transportation
  • Environmental Protection Agency
  • Food and Drug Administration
  • Federal Emergency Management Agency
  • Government Accountability Office
  • General Services Administration
  • Health and Human Services
  • Housing and Urban Development
  • Immigration and Customs Enforcement
  • Department of Interior
  • Department of Labor
  • Library of Congress
  • National Aeronautics And Space Administration
  • Navy
  • National Institute of Standards and Technology
  • National Oceanic and Atmospheric Administration
  • Office of Personnel Management
  • Office of Science and Technology Management
  • Peace Corps
  • Executive Office of the President
  • Patent and Trademark Office
  • Small Business Administration
  • Secret Service
  • State Department
  • U.S. Treasury
  • Seal of the United States
  • United States Agency for International Development
  • U.S. Department of Agriculture
  • Veterans Administration


Come on, you know you've wanted a NASA, DEA, US Treasury, Secret Service, etc web icon on your site...



Friday, May 24, 2013

From A to W... The US Gov goes Git (and API crazy too). There's an insane amount of data, APIs and OSS projects from the US Government...

Nextgov - White House Releases New Tools for Digital Strategy Anniversary

The White House marked the one-year anniversary of its digital government strategy Thursday with a slate of new releases, including a catalog of government APIs, a toolkit for developing government mobile apps and a new framework for ensuring the security of government mobile devices.

Those releases correspond with three main goals for the digital strategy: make more information available to the public; serve customers better; and improve the security of federal computing.


DATA.Gov - Developer Resources




Government Open Source Projects




That list of APIs and projects just blows my mind... I mean... wow. If you're looking to wander through some code, there HAS to be something here that you'll find interesting. There's something for every language, platform and interest, I think...


Related Past Post XRef:
Happy Birthday! You’ve grown so much in the last year… (from 47 to 272,677 datasets)

Saturday, May 11, 2013

The NSA Untangles the Web - 651 Pages of NSA Web Searching'ness...

The Verge - NSA reveals its internet search tricks in the recently declassified 'Untangling the Web'

The National Security Agency has declassified a version of an in-house training manual used to teach NSA members how to best utilize the internet for research purposes. Untangling the Web: An Introduction to Internet Research was written by Robyn Winder and Charlie Speight and published by the NSA's Center for Digital Content back in 2007. It's been declassified and made available now following a Freedom of Information Act request lodged by MuckRock back in April.

The document weighs in at over 600 pages, and tends to be geared towards an audience that may not necessarily be familiar with or see the value of the internet in research (then again, it's important to remember that the book was published six years ago). While chapters like "Search Fundamentals" and "Why Do I Need Help?" paint a basic picture, it also dives a bit deeper with sections like "Google Hacking," which focuses on "using publicly available search engines to access publicly available information that almost certainly was not intended for public distribution." Confidential company data, secret government information, and financial data are all listed as the types of data such searches could uncover.
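For flavor, queries of the general shape the "Google Hacking" chapter describes combine standard search operators like site:, filetype:, and intitle: with telltale phrases. These examples are illustrative, not taken from the manual:

```
site:example.gov filetype:xls "internal use only"
filetype:pdf "not for distribution" confidential
intitle:"index of" "passwords"
```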


While a few years old, it's still some really interesting reading... One thing to note: this is a scanned document, so you'll need to let Acrobat OCR it before you can search it, etc.