Geeks wear tinfoil hats too!
January 12, 2006 7:10 PM   Subscribe

National Information Exchange Model (NIEM) Sometimes, it's the unheralded steps that take you most quickly to your destination. On October 7, 2005, the U.S. Department of Homeland Security (DHS), the U.S. Department of Justice (DOJ), and their associated domains announced the first release of the National Information Exchange Model (NIEM) Version 0.1. NIEM "establishes a single standard XML foundation for exchanging information between DHS, DOJ, and supporting domains, such as Justice, Emergency Management, and Intelligence." The release of this specification, and the development of the systems that utilize it, may actually be the catalyst for more 'progress' in information mining on the individual than most other, well-publicized efforts. NIEM Mission: "To assist in developing a unified strategy, partnerships, and technical implementations for national information sharing — laying the foundation for local, state, tribal, and federal interoperability by joining together communities of interest." When you say it like that, it sounds sort of cool!
posted by sfts2 (18 comments total)
<cocks head> the sound of your privacy vanishing.
posted by spacewrench at 7:22 PM on January 12, 2006 [1 favorite]

NSA Blue--delicious!
posted by bardic at 7:30 PM on January 12, 2006

Global Justice XML Data Model (Global JXDM)

I love how important they make the rhetoric. This is not just a justice XML data model. This is a Global JXDM.
posted by effwerd at 7:46 PM on January 12, 2006

They can use a tagging system too.

Just think! - your name plus.....

"suspected communist"
posted by troutfishing at 7:57 PM on January 12, 2006

actually, this is a good sign. it's in xml so I'm sure everyone will use different tools to generate and read it and they'll all be incompatible. plus there will be all these insane extensions no one understands that some agencies will use and others won't. so, privacy is safe for now. :)
posted by R343L at 7:59 PM on January 12, 2006

Heh. From the NIEM press release: "The NIEM 0.1 release contains a collection of fifty-four (54) XML schemas and a Component Mapping Template for use by reviewers. Based in part upon the Global Justice XML Data Model (Global JXDM), NIEM 0.1 defines 250 types, of which 54 are Universal, 107 are Common, and 89 are Domain Specific. It also defines 2213 Properties, of which 273 are classified as Universal, 943 are Common, and 997 are Domain Specific."
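For what it's worth, the arithmetic in that press release does check out; a quick tally of the quoted figures:

```python
# Figures quoted from the NIEM 0.1 press release above; a sanity check
# that the type and property counts add up as stated.
types = {"Universal": 54, "Common": 107, "Domain Specific": 89}
properties = {"Universal": 273, "Common": 943, "Domain Specific": 997}

assert sum(types.values()) == 250        # "NIEM 0.1 defines 250 types"
assert sum(properties.values()) == 2213  # "It also defines 2213 Properties"
```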

Not to worry. It's going to take them about a year to train anyone how to mark up data properly, six months to set up a proprietary XML database system that can handle 54 schemas and several terabytes of data, another three months to debug the system when it breaks after the first week of production use, and then another year to bring the whole thing into compliance with XML Schema version 1.1, which will have been released in the meantime...
posted by Creosote at 8:02 PM on January 12, 2006

Here's the FAQ.

It tracks by SSN, so the best way to avoid it is to be born a foreigner of some sort. Bet Zarqawi never thought of that!
posted by swell at 8:04 PM on January 12, 2006

I'm just as terrified of my government spying on me as anyone, but this is the kind of thing I want to see: efficient and reliable federal communication. Put crazy legal controls on the "sharing" & enforce them with truncheons, but enjoy the benefits of technological progress.
posted by bingbangbong at 8:13 PM on January 12, 2006

The UK police do wonders with video surveillance, I've heard.
posted by troutfishing at 8:25 PM on January 12, 2006

I'm kinda with you, bingbangbong. This technology could actually be used for good purposes. However, knowing the current gang that's in charge......
posted by Afroblanco at 8:56 PM on January 12, 2006

In looking at the FAQ, I noticed that every Type (i.e. object) can have reliability, probability, and a half-assed information provenance.

How much do you want to bet that most applications will ignore this info?

The reason I call the provenance half-assed is that it deals with primary sources, not information chains. I.e. it can encode for any "fact" things like:
reportingPersonText="Detective Bill Smith"
reportingOrganization="LA County PD"
but cannot say that the information has been transmitted through a chain of 5 databases, any one of which, if not 100% bug-free, may have added errors or omissions to the data.

If there is any question about the accuracy of a report, where do you begin to look?
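To sketch the concern in Python (the element name is hypothetical, and the attribute names are taken from the examples above, not from the actual schema): a record's provenance attributes survive every hop unchanged, so nothing ever accumulates the chain of systems it passed through.

```python
import xml.etree.ElementTree as ET

# Hypothetical element name; attribute names from the examples above,
# not from the real NIEM/JXDM schema.
fact = ET.Element("IncidentReport", {
    "reportingPersonText": "Detective Bill Smith",
    "reportingOrganization": "LA County PD",
})

# Simulate the fact passing through a chain of databases: each hop
# re-parses and re-serializes the record. The record itself has no
# field in which the hop history could accumulate.
record = ET.tostring(fact, encoding="unicode")
for hop in ["county_db", "state_db", "regional_db", "fusion_center", "federal_db"]:
    record = ET.tostring(ET.fromstring(record), encoding="unicode")

# After five hops, the record still names only the primary source;
# the intermediate systems (and any bugs they introduced) are invisible.
print(record)
```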
posted by MonkeySaltedNuts at 10:50 PM on January 12, 2006

(aside) I'm just waiting for someone to mash this up with the Google Maps API. (/aside)

As for "controls," legal principle and policy are no more than the technology and processes that implement them. Technology is not neutrally valenced. Instead, tech architecture has profound impact on the ideals that it implements. The way to ensure that technology and process are "honest" to underlying legal principles is open and independent audit, perhaps oversight.

And it is that open "sunshine" that is the impossibility in this Administration. Take the NSA revelations--there is no oversight, even by the FISA court. Even taking the Administration's claims that the wiretapping respects personal privacy at face value (which you shouldn't), there is no basis to trust that the system implemented actually does that.

Every system has intrinsic design choices that may default one way or another. In the absence of oversight or audit, all that protects privacy are such "defaults" in the system. In this case, this "Global JXDM" favors information interchange. In the realm of personal information used by the DHS, this has strong repercussions for ideals of privacy.

Also note that this effort is funded through the DOJ Office of Justice Programs, which previously funded (see paragraph under RISS) the MATRIX project, and is also funding something called "Global," in a broader strategy of developing such systems (warning, large PDF). Draw your own conclusions.
posted by soda pop at 11:05 PM on January 12, 2006

posted by staggernation at 7:29 AM on January 13, 2006

I'm actually working on a state government project involving sharing information (human services) between different states. The feds were very involved because most of the money that made it possible for states to do this came from the feds by way of grants. They developed the XML schemas, documentation and html screens and the states built the webservices that handle the request/response as well as the web application that displays the information to the end user. All in all, I think it is an excellent experience. I feel I am qualified to answer some questions/concerns brought up here.

1) State/Feds are all about open source software, if they are technically capable of handling it. Most are. The feds definitely are and most of the states are as well.

2) Open format standards are good.

3) You would be surprised how fast most state entities can pick up new technology. Several states with little or no web services experience were able to 1) generate code from the XML schema/WSDL, 2) build the web application, and 3) test on the test schedule, all within 4-5 months.

4) The states are very cooperative with each other (in the open source sense)

5) This is data passing between different state entities on private networks. I really wouldn't have many 'privacy violation by hacker' concerns. Also, the feds are very concerned about who has access to what information and for a good reason. They don't want private information getting into the hands of non-state entities.

6) Tribal, as in native american tribes on tribal land. They are valid entities that the states deal with.
posted by kookywon at 8:38 AM on January 13, 2006

7) In our project there were several states using .NET servers, several using J2EE technologies (on different platforms), Perl, anything under the sun. There haven't been many incompatibility issues yet. Most of the problems have been agreeing on what particular data elements mean.

8) Global, in the US government usually means between different government entities. Not as in 'take over the world'. C'mon. I mean maybe they have plans to interact with interpol with it in the future, but they're probably concerned with just getting it up and running in the US for now.
posted by kookywon at 8:43 AM on January 13, 2006
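The hardest problem kookywon names, agreeing on what particular data elements mean, can be sketched as a mapping from each state's local field names onto a shared vocabulary (all field names here are hypothetical, not taken from any actual schema):

```python
# Hypothetical local and shared field names; the mappings themselves
# are the product of the semantic negotiation kookywon describes.
STATE_A_TO_SHARED = {"dob": "PersonBirthDate", "lname": "PersonSurName"}
STATE_B_TO_SHARED = {"birth_date": "PersonBirthDate", "last_name": "PersonSurName"}

def normalize(record, mapping):
    """Rename a state's local field names to the shared vocabulary,
    dropping any fields the shared schema doesn't define."""
    return {mapping[k]: v for k, v in record.items() if k in mapping}

# Two states, two local formats, one shared meaning after normalization.
a = normalize({"dob": "1970-01-01", "lname": "Smith"}, STATE_A_TO_SHARED)
b = normalize({"birth_date": "1970-01-01", "last_name": "Smith"}, STATE_B_TO_SHARED)
assert a == b
```

Once every participant maintains such a mapping, the wire format (.NET, J2EE, Perl) stops mattering, which is consistent with the observation that the real incompatibilities were semantic, not technical.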

Scheming schemas.
posted by furtive at 8:47 AM on January 13, 2006

Yeah, I don't get people being upset or weirded out by this. Why shouldn't our government have a well-defined vocabulary for the metadata that it collects? Why shouldn't incident records from disparate jurisdictions be annotated using the same vocabulary, and in the same manner? Hell, the healthcare industry -- which depends on having access to its data in a reliable way if it wants to be able to use it for research -- has incredibly intricate schemas and data storage models, and there are people whose sole jobs are to curate the datasets and make sure that the information contained within them adheres to the standards. If we want our government to be able to learn anything about security -- from incidents, from emergencies, from crimes -- then they need to be able to collaborate, and the only way for that to happen is for their data to conform to the same specs.
posted by delfuego at 5:20 PM on January 13, 2006

Oh, and I'll second kookywon -- the governmental healthcare agencies I've worked with are very into open-standard file formats and open-source software. Unlike many other groups and organizations, these agencies are excited by the idea of having their formats vetted by many other people, improved upon, and used to further research and patient care.
posted by delfuego at 5:23 PM on January 13, 2006


This thread has been archived and is closed to new comments