
Sunday, February 27, 2011

Introducing (belatedly) nx_spatial

It's been more or less done for a while, but here, finally, is a blog post about it.

nx_spatial is a collection of add-on functions for the networkx Python graph library. What can you do with it?

  1. Load GIS formats into networkx graphs (where you can do all sorts of crazy analytics on them)
  2. Perform upstream and downstream traces with stopping points.
  3. Set sources and find/repair edges that don't have the correct to/from nodes.

Example from the wiki:

>>> import nx_spatial as ns
>>> net = ns.read_shp('/shapes/lines.shp')
>>> net.edges()
[[(1.0, 1.0), (2.0, 2.0)], [(2.0, 2.0), (3.0, 3.0)], [(0.9, 0.9), (4.0, 2.0)]]
>>> net.nodes()
[(1.0, 1.0), (2.0, 2.0), (3.0, 3.0), (0.9, 0.9), (4.0, 2.0)]
>>> source = (2.0, 2.0)
>>> ns.setdirection(net, source)
>>> net.edges()
[[(2.0, 2.0), (1.0, 1.0)], [(2.0, 2.0), (3.0, 3.0)], [(0.9, 0.9), (4.0, 2.0)]]
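nx_spatial's actual implementation isn't reproduced here, but the idea behind setdirection - walk outward from the source and flip any edge pointing the wrong way - can be sketched in plain Python. The function below is a hypothetical stand-in for illustration, not the library's code:

```python
from collections import deque

def set_direction(edges, source):
    """Reorient edges reachable from `source` so each points away from it.

    `edges` is a list of (from_node, to_node) pairs; edges the walk
    can't reach are left untouched. A breadth-first walk from the
    source decides the correct orientation of every edge it crosses.
    """
    # Build an undirected adjacency map so we can walk either way.
    neighbors = {}
    for a, b in edges:
        neighbors.setdefault(a, set()).add(b)
        neighbors.setdefault(b, set()).add(a)

    oriented = {}          # frozenset edge -> correctly ordered tuple
    visited = {source}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for nxt in neighbors.get(node, ()):
            if nxt not in visited:
                visited.add(nxt)
                oriented[frozenset((node, nxt))] = (node, nxt)
                queue.append(nxt)

    # Rewrite each original edge in its decided direction, if reachable.
    return [oriented.get(frozenset(e), e) for e in edges]
```

Run against the wiki example above, the edge from (1.0, 1.0) gets flipped to point away from the source, while the disconnected (0.9, 0.9) edge is left alone.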

Available on PyPI or Bitbucket. Eventually I want to integrate it with networkx trunk (loading shapefiles is already in 1.4).

Posted via email from The Pragmatic Geographer

Tuesday, September 28, 2010

WhereCampPDX 2010: Some of the Sessions

It's going to take me some time to fully decompress and write this up, as a good portion of what I saw in Portland was new to me in some form. It is easy enough to read about the hacker community and fiddle with the cutting edge stuff like GeoCouch: it is quite another to actually meet the people and see what they are cooking up.

Qt Designer (Holly Glaser), GeoCouch/GeoJSON (Max Ogden)

I hopped between sessions on this one, so I missed about half of Max's talk. The Qt session was small, with David Turner walking Holly through making a survey plugin.

The latter session showed off an interop tool for converting shapefiles into GeoCouch, and why you might want to use that datastore over, say, PostGIS. The larger point, which was only briefly mentioned and which I'd like to examine more closely at a later date, is the assertion that simplicity in this sphere breeds adoption. I don't know that I necessarily agree.

Domination of the Shapefile (James Fee)

I don't recall the real title, but the alt title was "Why I Was Wrong in My Last Keynote". It turns out SpatiaLite, though an improvement over the traditional method of bulk data delivery (masses of shapefiles), didn't have enough going for it to see mainstream adoption. There just isn't enough wrong with shapefiles for data providers to bother switching. Worse is simply better, at least in this case.

Still, there are problems in data interchange that need to be addressed, the big ones being (1) styling and (2) attribute relationships. I honestly don't see too big a problem with the shapefile being multiple files, since people tend to zip such things up anyway, and you probably want more than one shapefile at a time.

This was an interesting discussion and probably merits its own post; the examples were particularly illuminating (FGDC, GML, Gulf Spill map styling).

Keynote (Nathaniel Kelso)

Nathaniel works for the Washington Post making beautiful maps. He showed off examples - the accuracy of the coordinates for WhereCamp, Top Secret America, crowd estimation at the Obama inauguration - around the theme of "filling in" the map. Of an estimated 2.3 million inhabited places, we have good maps for only a small fraction.

Multimodal Trip Planner (David Turner)

This talk is an odd intersection of network topology, real-time scheduling, dynamic mobile resources, and human behavior. When people say they only want to travel on bike lanes, does that mean they'll go five miles out of their way to do it? What if they want to bike only downhill by cleverly leveraging Portland's excellent mass transit system?

One thing I didn't bring up, and wish I had, is that this is an app that essentially lets people plan their travel around a relatively fixed mass transit schedule. What happens when everyone is using such information all the time? Would it be possible to start dynamically modifying the transit system itself to best integrate with user demands?

Geodata for the Masses (various)

I caught part of this talk late after briefly stopping in on the OSM editing and civic apps sessions. By the time I got there it appeared to have devolved into a talk on DIY data collection by kite and balloon.


Wednesday, August 11, 2010

"How do I get a GIS job?"

I got this question from a recent grad and I'm not sure if I gave a good answer. Or, for that matter, if there is much advice that would help someone graduating into this job market. One theory going around is that timing like that can hurt forever.

This was my response:

What kind of GIS job do you want? I see the GIS field as roughly divided into administrative, technical, and special domain-specific knowledge areas. You can specialize and in some cases earn more, or diversify and be able to potentially apply for a larger number of positions. Everyone falls a little into all of these categories. Decide to what extent you wish to specialize, and in what area.

  1. Administrative: Your project managers, leaders, and communicators. Even the nerdiest technical expert needs to be able to properly gather objectives and communicate requirements. The deeper you go into this, the more likely you will be delegating the actual GIS work.

     You get better at this by getting better at written and verbal communication in general. I don't feel exceptionally qualified to talk about leadership, though increasingly I have been thrust into project management work.

  2. Technical: GIS is a technical field. If you lack technical skills in a technical field, you should fully expect to be mocked. This area can be further subdivided into people leaning toward sysadmin/computer janitor work and dedicated developers.

     I can recommend some fantastic books if you are interested in development - starting with The Pragmatic Programmer, CODE, Code Complete, and Programming Pearls. Just reading isn't enough; I would look into contributing to open source projects.

     For the GIS admin/analyst types, I'd recommend working through the entirety of the book GIS for Web Developers. You can go from nothing to managing an entire open source GIS stack (which is fairly similar to ESRI's in terms of architecture).

  3. Domain knowledge: Geographic information systems usually serve some greater business need, such as the mandates of public agencies or the profit centers of a private corporation. Specialized knowledge of those needs is a career in itself, and the more you specialize here, the more that career becomes yours. People who fall closer to this area exist on the fuzzy line between being an actual GIS professional and being a professional in something else.

I'm not going to claim to be the best authority on this particular topic, but I've had success with this particular mindset. A good discussion of the skills you need for each can be found on the new GIS StackExchange website here: http://gis.stackexchange.com/questions/883/gis-development-skills


Wednesday, July 21, 2010

Geometric Network Geoprocessing

I've been hammering away at a GIS migration project that has gone on far longer than anyone expected, but I've gotten the opportunity to hack away at some Python to more or less fix a problem I found on TheGISForum:

I currently have a geometric network that I've been working with using the Utility Network Analyst tools. However, I've been trying to perform a few tasks that I've noticed the Network Analyst tools provide, but the Utility Network Analyst does not.

Specifically, I want to place barriers at all locations specified by a point feature class - a functionality provided by the "Add Locations" tool in the Network Analyst Toolset.

Is there a way to use the network analyst tools on a geometric network?

The answer, sadly, is no. As far as I can tell, ESRI has not released any geoprocessing tools for their Geometric Network.

But it is possible to roll your own solution to get the same kind of tools by using existing open source libraries. I have a newbish work-in-progress project along these exact lines.
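As a rough illustration of what a roll-your-own barrier tool might look like - the function names and snapping tolerance below are invented, and real data would come from the feature classes rather than literals - the core is just snapping each barrier point to its nearest network node, then refusing to walk through those nodes during a trace:

```python
import math

def snap_barriers(barrier_points, nodes, tolerance=1.0):
    """Snap each barrier point to its nearest network node.

    A point only becomes a barrier if it lies within `tolerance`
    of some node; otherwise it is ignored.
    """
    snapped = set()
    for p in barrier_points:
        nearest = min(nodes, key=lambda n: math.dist(n, p))
        if math.dist(nearest, p) <= tolerance:
            snapped.add(nearest)
    return snapped

def trace(adjacency, start, barriers):
    """Trace the network from `start`, never passing through a barrier.

    `adjacency` maps each node to its connected nodes. Barrier nodes
    themselves are excluded from the result.
    """
    visited, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node in visited or node in barriers:
            continue
        visited.add(node)
        stack.extend(adjacency.get(node, ()))
    return visited
```

This is a sketch, not the "Add Locations" tool; a real version would read the point feature class and network geometry out of the geodatabase first.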


Wednesday, April 21, 2010

WAURISA 2010 Conference

The Washington URISA conference is just about over, and it's made me realize how long it has been since I've gotten a chance to interact with the local GIS community. Some impressions:
  • The panels were probably the best part. I see now why the Barcamp/Unconference model has become so popular. It's unfortunate that no one from ESRI/AutoDesk/etc. participated in these as far as I can tell - just the local/state government representatives and open source guys.
  • Best talk I heard was from Aaron Racicot on haiticrisismap.org and OSM. Even if you came in knowing the story, the energy of the speaker, quality of maps, and audience participation was excellent.
  • I was happy to see there wasn't too much stigma attached to hopping between rooms to hit particular talks, as long as you were quiet and professional about it.
  • It was nice to shake hands with some Well Known Nerds (WKN) of the open source GIS community.
  • Some excellent balance in conversations about the variety of solutions to a given problem. No one was especially partisan about a particular product, software, or method.
  • Vendors were not as aggressive as I had been warned.
  • The turnout of the CUGOS group was good. I didn't see any equivalent ESRI or AutoDesk user group wandering around in a similar fashion.
  • Smart people giving poor presentations is heartbreaking. I know this might ring hollow coming from someone who didn't give a talk, but some folks should thumb through Presentation Zen again. Walls of text all over the place.

Thursday, February 18, 2010

Spatial Metadata: Why is it so hard?

Why can't my GIS data be as easy to organize/provide metadata for as my music collection?

I want a clean, detailed interface to explore my files. I want an indexed library for fast searches and simple playlist dataset/layer package/group/namespace creation. I want to change the owner, license, one-line description of files the same way I might change them in a Music folder in Windows Explorer, not by diving into some ugly wizard.

ESRI shouldn't throw out ArcCatalog, they should make it the iTunes of geospatial information. Move it from the smaller segmented market of GIS professionals to the mass market. Clean up the UI, throw up a "store" that connects with ArcGIS Online and the Resource Centers. Refashion the toolbox into components available for purchase from an in-application repository and open that repository to 3rd party developers (with a trusted [reviewed] and untrusted section).

Have it read every damn spatial format possible. Embed Webkit so you can jump to Bing, Google MyMaps, OpenStreetMap, WeoGeo, and everything else out there of geospatial interest. Metadata is largely a solved problem with my music in Foobar (music player). Why? Because of freedb, a license-free database of song metadata. There isn't any reason why we couldn't do something similar for spatial data.

Ideas are cheap and this is a lot of hard work, but it is something I'd like to see. There are a lot of moving parts to spatial data - a lot of sources, metadata, and datastores. It would be nice if we could abstract away all the stuff you currently have no interest in. The fact that your spatial data is in PostGIS, or AGS, or in the cloud somewhere is unimportant when all you want to do is supply the key information of when it was collected and why.

Friday, February 5, 2010

Fun with SQL Spatial

This is old hat by now, but I love how much cruft can be killed by outright avoiding older APIs.

A unique location number needed to be generated for an engineering design tool. A vendor contact sent an example to work off that was about 400 lines of C# code. This little stored procedure replaced basically all of it.

CREATE PROCEDURE [sde].[IntersectGrid]
    @x FLOAT, @y FLOAT
AS
BEGIN
    SET NOCOUNT ON;

    -- Build the point directly; the original string-built version used
    -- STR(), which defaults to zero decimal places and truncates coordinates.
    DECLARE @g geometry = geometry::Point(@x, @y, 2);
    DECLARE @grid int;

    -- Grab the grid cell the point falls inside.
    SET @grid = (
        SELECT TOP 1 [areaName]
        FROM GridTable
        WHERE @g.STWithin(Shape) = 1
    );

    RETURN @grid;
END

How do you make this 400 lines? Easy: use the ArcObjects API to do the intersect. Instantiating dozens of objects, checking licenses out and in, and using reflection to read a config file (not sure why they didn't just use AppSettings) adds up fast.

I could probably do it with even less effort using Shapely, but no one else in this situation (vendors or coworkers) is really familiar with Python.
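For the curious, the predicate the stored procedure leans on (point in polygon) is what Shapely's within/contains would compute. A dependency-free sketch of the classic ray-casting version looks like this; find_grid and its inputs are invented for illustration:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting point-in-polygon test.

    `polygon` is a list of (x, y) vertices; the ring is closed
    implicitly. Count how many edges a ray cast to the right of the
    point crosses: an odd count means the point is inside.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the horizontal line through the point,
        # crossing it to the point's right?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

def find_grid(x, y, grid_cells):
    """Return the areaName of the first grid cell containing the point."""
    for area_name, ring in grid_cells:
        if point_in_polygon(x, y, ring):
            return area_name
    return None
```

With Shapely this whole thing collapses to something like `Point(x, y).within(cell)`, which is why the 400-line version stings so much.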

Monday, November 2, 2009

New State, New Faces

So, there has been a little change of scenery since I last posted. I got a job in Washington and didn't have to be in too much of a rush to move (unlike when we moved down to Arizona). It went something like this:
Northern Arizona, the Hoover Dam, some part of California, some other part of California (you'll never guess), then up the Oregon coast - which I would heartily recommend; nearly as pretty as that girl. Now we are living in Port Orchard. It is rather nice.
Still settling in, so I will leave more details for another post. The summary: doing GIS work for Peninsula Light Co, a nonprofit coop power utility, enjoying the temperate climate, and taking the ferry to west Seattle on the weekends to see friends. I've been rather terrible about updating this blog recently due to the effort required in the move, but I intend to talk more about the area and the new job in more frequent posts.

Thursday, August 13, 2009

More graph tracing - this time for water quality.

My last interlude with tracing the water system involved looking for hydrologic hazards - specifically, other potential sources of water that could confound maintenance efforts. Apparently I impressed someone enough with it to get a new trace-oriented project having to do with water quality.

I'll have a second part to this covering my first run at it and a subsequent refactoring that I'm extremely happy with, but first I wanted to mention how I was documenting it. As far as I can tell, the corporate standard for documentation is Microsoft Word documents. At best, these have a relatively easy-to-navigate table of contents and are stored in the same location as the topic. At worst, they have neither attribute, or don't exist at all - and there is no practical difference between those two situations. No one will ever find them, which for development projects is especially problematic. The Don't Repeat Yourself (DRY) principle applies outside single projects.

There were some things I really wanted for the documentation I was going to produce for this project.
  1. Fast, built in search. Amazing the difference this makes.
  2. Something you could put up on the web with little to no fuss, but wasn't actually a public facing website.
  3. Allowed all kinds of markup, images, other resources.
  4. Wiki-style editing - who changed what, when, and some measure of version control.
  5. Plaintext or in some easily parsed format.
  6. I could quickly convert it into Word if I caught too much flak for not using it.
What I eventually settled on is called TiddlyWiki (Google, Bing, Yahoo!... I'm just certain there is a rule that Web 2.0 stuff requires a childish name for success). It's one HTML document, and thus very portable, with a bunch of JavaScript that implements all the functionality I wanted above. It also apparently has a lively plugin community that I haven't had time to peruse.
Documentation is important to preserving the intent of any project.

Friday, May 1, 2009

Future of GIS Analysts, Part 2

In the first part of this post, I tried to put together a simplified list of some of the activities I have found GIS analysts doing as part of their job. The goal was to get folks talking about future roles and whether GIS analysts have a future ten years from now. I was happy commentators posted some things I missed.

I hadn't included, but geographygeek helpfully mentioned in the comments, the role of an analyst after the data has been collected and processed. Some (hopefully most) are trained to answer the question, "Well, what does that mean?" Another commenter, KindaSpatial (who puts out a rather good geography podcast), wanted me to talk a bit about some of the newer 3D and hyperlocal data and interpretation. I'm honestly not sure I am qualified to, but I'll give it a shot and have people correct me later. I like to think of blogging as more of a dialog.

On closer inspection, each of these should be its own blog post, so that is what I will do. For now, I will give my initial impressions by going down the list and examining each item, asking the same questions: Can this be automated? Is it easily outsourced? Is another profession largely absorbing it? Here are my thoughts. They are not quite fleshed out; feedback greatly encouraged.
  1. Map production: Outsourcing: try to make or get a good, topic specific map via phone conversation. Automation: still a lot of overhead software knowledge required for the kind of quality maps necessary for professional reports. Professional designers have all of the aesthetic abilities necessary for this, but little knowledge of the pitfalls of cartography - maps are like statistics that are even easier to lie with.
  2. Requirements gathering: Probably impossible to automate, and you can read the hilarious results of trying to outsource it elsewhere. Increasingly the realm of project managers with enough GIS experience to know what is available/feasible.
  3. Feature creation and maintenance: The simple stuff can and will be outsourced or automated. Stuff that requires boots on the ground simply can't, and yes it is hard to tell the difference. I don't think the dimensionality of the data makes a significant difference here.
  4. General IT/helpdesk support: This really is the work of IT professionals, but for smaller firms or feudal departments it isn't going anywhere.
  5. Database/content management: All information has a location-based component, and this function exists only so long as professional database administrators are uninterested in how the middleware (SDE and PostGIS, I believe) works.
  6. Minor automation tasks: Prime candidate for professionals to do the automation - it ends up being cleaner and reusable.
  7. Post-processing interpretation: (hat tip: geographygeek) This dovetails nicely with what I think is the core of GIS - the visual display of quantitative information. What would you replace a geostatistician with? It doesn't easily fit in with other professions, and the methods are unique enough to be difficult to automate for easy public use.
  8. Hyperlocal data: (hat tip: KindaSpatial) The mass of data associated with this seems to require, not merely lend itself to, automation. Turning that stuff into interesting visual works or meaningful statistics sort of falls under the previous point.
Based on any comments/corrections I get, I'll be going through these points individually. I haven't been at the GIS game too long, so I'd love to hear from some people from different backgrounds.

Future of GIS Analysts, Part 1

A question occurred to me as a result of a comment made by David Bouwman on Twitter. Folks were congratulating James Fee, who had just been offered an opportunity to teach GIS at the Arizona State University masters program.
Teach them that "GIS Analyst" will be a rare job in 10 years - just like "Database Analyst" is today.
My question(s): what do we mean by GIS Analyst in this context - what functions do they provide today that will be unnecessary or absorbed by other jobs? My official title is something like GIS Analyst, though this kind of talk might be more disconcerting if I hadn't already oriented my career towards application development. But I know the people I work with, professors, and certainly students would be interested in discovering whether this is in fact the case.

To properly investigate this matter, let's consider first what GIS analysts (who we can probably group with specialists, technicians, etc.) do that makes them necessary today; then, in part two, we can examine what that role might look like - if it exists at all - in the future. Here is a quick list of the roles I've seen played by GIS analysts:
  1. Map production: Your standard cartography work. Organizing the layers, layouts, titles, etc into an aesthetically pleasing package and either plotting/printing it off, or, more recently, publishing it as a service.
  2. Requirements gathering: Client communication, identifying potential solutions from user stories, project specification and some project management.
  3. Feature creation and maintenance: Gathering and organizing data from disparate sources, digitizing/COGO work. Associated documentation/metadata probably falls in here too.
  4. General IT/helpdesk support: Particularly the case when it is a small firm without a real IT department or person, or if that person/department is swamped, or doesn't know anything about GIS software, or IT's grasp on individual departments is tenuous.
  5. Database/content management: Organizing databases, particularly ESRI geodatabases - what belongs in a given dataset, should it be part of the network, etc. File management of documentation, supplementary data.
  6. Minor automation tasks: Modelbuilder, Python, VBA stuff. Almost any programming task where not knowing the basics of object oriented programming is not much of a hindrance (though it makes for terrible code).
I'm missing probably a hundred more things analysts do and I'd like to invite everyone to help me add to the list.

Saturday, January 31, 2009

From GIS User to Developer - Part 3.

GIS college education typically prepares students for the very basic entry-level positions. Assuming you want a job five years later that isn't the subject of ridicule or largely eliminated by automation, you should consider yourself a lifelong student.

And just what are these entry-level GIS positions? Again, my experience in the field is limited to just a few years, but from talking to people, it seems pretty standard. Many of them are georectification/image processing monkeys. Some are brought on to help maintain larger datasets like municipal water/power lines, parcel boundaries, etc., assuming there isn't a process for the drafters to input their designs manually. Still others are cartographers in the sense that they make custom stylized maps on demand. There is absolutely an art to this and, with the proper motivation, you can make a real statement (or lie) by including, omitting, or tweaking the standard map components. The proper visual display of quantitative data can shift public opinion, lead to medical breakthroughs, and is sometimes the cause of international disputes. It really shouldn't be underestimated.

The Advice

If you find yourself in a standard entry-level GIS position, or applying for one, you should be looking for the means to specialize. As far as I know there are three primary ways of doing this.
  1. Not Business: Environmental analysts, land use planners, medical researchers; you name it, they are probably using some form of GIS. Increasingly, even the analysis is done by the actual specialists rather than contracted out to a GIS person - the result of progressively easier-to-use tools and their acceptance into mainstream use. You don't necessarily need to go back to school for this; realistically, the extent of what you can do with GIS software is such that there is a place for people with just enough business knowledge to be the interface between those specialists and the tools. At least for now.

  2. The Data: What features do you have, which ones do you need, how do they relate to each other, how will you organize them? Depending on the size of the organization, there might be a task force dedicated to answering these questions. They will likely be IT professionals, so that is the kind of secondary activity I would recommend.

  3. Applications: Create the tools the other two groups use. The bleeding edge here is Rich Internet Applications (RIA) using Silverlight or Flash to make extremely fancy web-based mapping applications. And of course, there are countless other developers working on in-house custom applications (via ArcEngine) or extensions to existing GIS software (usually ESRI's).
At the moment I am sort of doing all three of these things, but pushing myself more towards the applications side. I'll have a much bigger chance to do so when I've wrapped up my thesis (Feb 13th defense, March 6th text deadline). Really, I am still new to a much wider GIS world and it would behoove people still doing the entry level stuff to check out some blogs written by people more knowledgeable than myself.

Friday, January 23, 2009

From GIS User to Developer - Part 2

School

I finished up a GIS-oriented Geography degree from CWU a few years back. From what I have heard about other, similar programs around the United States, my experience was typical - some general GIS/computer science history, labs following relatively closely to ESRI-published workbooks using the ubiquitous ArcInfo suite, some quantitative/statistical methods, and some remote sensing stuff. I was lucky enough to do some special projects for actual state and local agencies who wanted some free/cheap maps with perhaps a smattering of analysis work.

I thought I would just walk out of the college doors and fall into a low/entry-level GIS job, and I might have been able to. But I met a lovely girl, who I ended up marrying, so luckily I found a pretty good excuse to stay around the college until she finished.
I don't really have a relevant picture for this section so here is CNN embarrassing themselves.
And Yet More School

Up until I came down to Phoenix, everything I learned about GIS came from Dr. Robert Hickey. He was, at the time, the only member of the faculty to know much about GIS, and apparently I had annoyed him enough with not-always-easy-to-answer questions about his courses. This left the probably mistaken impression that I was more intelligent than I actually was at the time. I was curious and motivated. I liked the work and wanted to master it to the extent that was possible. This basically remains true. Anyway, this annoyance/mistaken-impression factor got me an assistantship running the GIS lab at CWU. This was basically a helpdesk/IT role that, along with some independent contracting, paid my way through grad school. I get to defend in a few weeks.

The Advice

The only tool I left school with for automating common GIS tasks was ModelBuilder. Don't let this happen to you. At the very least, pick up SQL - even if you don't want to become a developer and intend to live the haunted existence of a business analyst. If you are taking GIS classes you pretty much already have the logic necessary to grasp SQL. Done an attribute query? Then you've basically done half of a SQL SELECT statement. It is a "language" with a ridiculously small amount of actual syntax, but it can be extremely powerful. Also, it exists in every database in the world. Also, you can be justifiably ridiculed for not knowing it.

Assuming you don't take a minor in Computer Science, which I would highly recommend at this stage, try this:
  1. Make your ModelBuilder runs. Make them as complicated as you want.
  2. When finished editing, go to File > Export > To Script. Choose Python; you'll be happier learning JavaScript later.
  3. Read this basic tutorial and try to make small changes to your script. I know I complained about documentation just a post ago, but mostly because I have otherwise been spoiled rotten by how good ESRI's usually is. Read that too.
You can learn a lot of the basics of object oriented programming from this stuff. There is an alternative, darker route involving the use of the VBA editor in ArcMap. Nothing you make there will make you proud later, and it is possible your future children will be subject to your subsequent crimes.
Some computer science is highly recommended.

Wednesday, January 21, 2009

Brief interlude: What the hell ESRI?

Listen, this makes no sense:
  1. The documentation for your VE and GMap JS extensions - which appear to be roughly identical in function - contains very different information. To figure out details of the VE.Geometry.ProjectToVEShapes method, I had to hunt down the similar function in the GMaps documentation.
  2. Stop pretending these are really REST functions, and be a little clearer about which JS functions are just SOAP requests you later turn into JSON client-side. ProjectToVEShapes demands a proxy for larger requests, which means it is running into browser security features meant to prevent cross-domain scripting attacks (detailed a bit here). Oddly, the demand for a proxy was only returned when I tried it in IE; in Firefox I had trouble finding it even with Firebug.
  3. The JS objects you return from ArcServer queries differ in format to the JS objects you require for functions in the aforementioned extensions. If I make an ArcServer Query for some feature class, the object it returns is not valid as an input to Project, for example.
These API extensions, and ArcServer itself, are pretty new. I can understand some of the documentation stuff, but could you be a little more consistent about the JSON formatting? Converting between the two is a bunch of unnecessary work. Am I wrong here? I'm pretty new to this; I could be missing a simple solution or not understand the problem.

Tuesday, October 21, 2008

Flex vs JavaScript for creating online maps

An interesting discussion over at James Fee's blog on this subject. It looks like more folks are going with Flex for some very simple reasons:
  1. Sure, its dedicated IDE costs money, but so does Visual Studio.
  2. Adobe AIR allows for movement of the app to the desktop.
  3. Browser compatibility, particularly with IE, is a massive problem, and Flex sidesteps it.
  4. Reported better performance.
Not to repeat the whole thread, but the most interesting comment revolved around how Adobe has done what Java was designed to do but couldn't - get people to install a plugin that is basically browser agnostic and has more functionality than JavaScript (credit to Matt Giger).

Update: More commentary from the developers of the Geocommons website. Geocommons is worth a whole post I'll probably do sometime soon; it is pretty nifty and uses Flex.

Saturday, October 4, 2008

GIS In the Cloud

Cloud computing has been the hot topic in many GIS circles for the last year or so, largely for the same reason it is building steam in most IT circles in general - datacenters and bandwidth speeds are nearing the point where the promise of mass cloud computing is feasible for corporate users (which is where the money is).

For anyone not previously versed in this topic, cloud computing is basically the movement of applications, services, and data from local storage to massive datacenters run by the likes of Google, Yahoo, Amazon, and Microsoft. You probably use it already - say, if you use GMail rather than a local application like Outlook or Windows Mail. Maps have obviously moved there too. If you own a computer you probably use some kind of online map for directions. It doesn't necessarily need to be in a browser, either - Google Earth and NASA's WorldWind might be local applications, but all of the data and services are running off of datacenters somewhere else.

It is believed this somewhat slow progression is going to accelerate as activities typically performed by local IT departments for small and medium-sized businesses are increasingly replaced by cloud apps offered basically for free by the above organizations. It might seem foolish to marry yourself to a particular platform or business in this fashion, but (1) a lot of these things are built on open standards like LAMP anyway and can be transferred around, and (2) companies marry themselves to a vendor all the time (see SAP). Traditional desktop software vendors are shifting to do their stuff at least partially as a service (and thus online). Windows 7 isn't going to have a mail program; it is going to fill that functionality with Live Mail. ESRI, the biggest GIS software vendor, has made it a point to make it extremely easy to put online data services into map documents just as you would add layers on a local computer.
A great number of GIS data providers, largely governmental, are not by and large going to venture into the cloud all that soon, not without intervention by legislators. Why? Liability, tradition, and data sensitivity. What public data exists is often of variable quality, especially when overlaid with information from other sources. Throw some bad data out there, even with metadata that includes a hefty disclaimer, and morons will still use it to hike into a blizzard and sue you for having to eat their children.

Even with good data, there is the problem of interpretation. Take a parcel layer from any given city or county government in the United States and throw it on Google Earth. GE does a pretty good job, but I would wager good money the satellite imagery isn't accurate to a quarter of an inch. The parcel layer is, by law. The number of people who would go onto Google Earth and stir up property disputes without this knowledge is probably substantial enough to be a factor in deciding whether to release it.

And of course there is the security issue. Knowing which substation can black out a particular city block, which water main feeds that block, where the communication lines run, the GPS locations of emergency vehicles - this stuff could be used not just by some existential terrorism threat but by ordinary criminals to cause all sorts of mischief and evade capture.

Tuesday, September 16, 2008

Programmatically tracing a network in ArcGIS

Recently I was tasked with tracing a water network, and after studying a previous programmer's work on the subject, I realized the solution is just a modified tree algorithm that is largely language independent. The trick is a recursive function that continues to call itself until it has gone down the totality of a single path. When it can't keep going - because it reaches the end of the line, or because you convince the program it is the end of the line - it hops back one function call and tries an unvisited path. If it finds one, it repeats the whole process.

The ArcObjects API contains the extensions for Network and Utility Network tools, but as far as I can tell, those are mostly for listening for trace events rather than starting your own. Since a network like that is just a modified topology (it inherits from ESRI's topology object), you can use ITopology to create a TopologyGraph, which allows access to the edges and nodes required. But you don't need to use C#/VB.NET/VBA to pull this off. It would be possible with the API ESRI provides with its Python scripting object (IDispatch), as that contains start and end points in its Geometry object, or by creating your own node/tree structure.

I understand this is a fairly elementary use of recursion, but as a novice programmer it's a lot of fun putting it into action for a real project.
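The recursion described above can be sketched in plain Python, independent of ArcObjects. This is a minimal illustration, assuming the network has already been read into a simple adjacency dict (node to list of neighbors) rather than a TopologyGraph; the names `trace`, `adjacency`, and `barriers` are my own, not ESRI API calls.

```python
def trace(node, adjacency, barriers=None, visited=None):
    """Depth-first trace from `node`, stopping at barrier nodes."""
    if visited is None:
        visited = set()
    if barriers is None:
        barriers = set()
    visited.add(node)
    for neighbor in adjacency.get(node, []):
        # Skip paths already walked, and skip user-defined stopping points
        # (the "convince the program it is the end of the line" part).
        if neighbor in visited or neighbor in barriers:
            continue
        # Recurse down this path; when it dead-ends, control "hops back"
        # here and the loop tries the next unvisited neighbor.
        trace(neighbor, adjacency, barriers, visited)
    return visited

# Example: a small branching network.
net = {
    "A": ["B"],
    "B": ["A", "C", "D"],
    "C": ["B"],
    "D": ["B", "E"],
    "E": ["D"],
}
print(sorted(trace("A", net, barriers={"D"})))  # ['A', 'B', 'C']
```

With the barrier at "D" the trace stops before reaching "E", which is exactly how a valve or stopping point would cut off part of a water network; without barriers the same call visits all five nodes.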

Friday, August 15, 2008

Poor design

Sorry I haven't updated in a while, I have been busying myself with some reading and general related work. More and more I am putting myself into the position of a GIS developer rather than an operator - creating tools with ESRI's ArcObjects API. Whether this is "real" programming or not depends, I suppose, on your definition. I still consider myself something of a novice to the actual field of computer science, but I feel comfortable creating custom scripts and operator tools for ArcGIS in C#, Python, VB.NET, and T-SQL. I could say I am going to the trouble of learning all of this because it increases my earning potential, but it's actually a lot of fun. Computers are rather poor at what the human mind generally does - analysis and higher order modeling - and human minds simply cannot match the power of computers to instantly recall and organize memory (not to mention their potential for networking with the vast knowledge of the internet). Interfacing with a computer at progressively lower levels greatly increases the collaborative power that comes as a result of normal use, and it is a joy to be working in such a way.

Tuesday, July 1, 2008

Taking a picture of an object without using light that interacted with it.

When you look at something - any object - your eyes are reading information from all of the light reflected off that object. A leaf appears green because it absorbs or scatters all other kinds of light.

Light actually consists of individual particles called photons. An odd thing about particles: if you "entangle" one particle with another, they are still affected by one another over great distances. Change the spin of one, and the other one reacts. It doesn't seem to make too much sense, but that is quantum physics.

So what happens if you could capture the information of light particles that are hitting an object, but you only have contact with their entangled friends? Turns out you can make an image out of it anyway. The information is, I am told, only useful if you get the other one back too, but it is still pretty nifty. Below is an image of a toy soldier the researchers viewed by this indirect method:



This will be sort of a big deal, as in the future it might allow us to see through all sorts of things. I know it seems strange, since we only see a cloudless Google Earth whenever we want, but satellites still need clear days for good imagery. This might be another way of mitigating those pesky clouds.

The artificial barrier of licensing, a GIS/Surveying example

There was news recently of a feature story pulled from a professional survey magazine because the work in question was, according to the state licensing board, depicting activities that should only be done by licensed surveyors rather than GIS professionals or anyone else. James Fee doesn't know what part of this mess is the worst, but I'd like to take an amateur stab at it.

The artificial barrier that government licenses produce can in fact be a good thing when the occupation is such that a minimum standard is required to avoid large-scale fraud, but in so many cases that vetting could probably be done by private organizations. Many licenses are put in place by a vocal and monied minority in an attempt to create what amounts to a cartel; I believe one example involved a manicure license that cost thousands of dollars. Professional survey licensing may fall under this, but my limited knowledge of that industry compels me to limit such rhetoric.

This isn't the biggest problem with what occurred. Actually, the biggest problem isn't even the suggestion that GIS professionals - or, for that matter, simply knowledgeable members of the public with increasingly cheap GPS devices - are not competent to do location-based field work. For so much geographic data, 10m resolution is a godsend where previously no one was collecting data at all, and as more nations put up satellites, even finer resolution will likely reach smaller and more casual devices. This is not to suggest there isn't a place for professional surveyors, both for ultimate liability and for expertise with the more effective tools and procedures. But the perceived elitism is somewhat disconcerting. As a GIS/programming guy, I don't find anything I do so absurdly difficult that it would pain my eyes to read an article about an amateur trying it.

The biggest problem is that a state licensing board can effectively kill an article. There isn't any way in which this is a good thing, and I hope whoever has their hands on it leaks it to the internet.