Hello, GIS Studio followers! WordPress tells me there are 200 of you? Seriously? Well, let’s put that power to work.
I am creating a new half-day workshop designed to cover intermediate Python topics as they relate to GIS tasks.
Here’s an outline of the topics so far:
- Setting up a code template with built-in tools to get you up and coding faster
- A peek inside a more powerful IDE and why you would want to use one
- Using Git – what it is (revision control software) and how to set yours up
- A look behind the scenes at task automation with Python – the nightly GIS cleanup that powers a local government
- Advanced ODBC / arcpy data access module connection strings
- How to use ST_Geometry in your queries to run spatial queries without ArcMap
- Zipping files in Python
- Logging to a text file or SQL database
What is missing? What would you all like to get out of an intermediate class?
It will be a half-day lecture format – unless we generate a long list, and then maybe we will shoot for an all-day class!
In my last post on time, I mentioned that I wanted to check whether features in GIS data had been updated or not. If a feature had changed, I wanted to copy the modified data and record it in an archive. Originally I thought I would just check in every 5 minutes, or 30 minutes, or whatever the client wanted. Then I started thinking about all the things that could go wrong. If the script ran every 5 minutes, and the process of archiving the data took longer than 5 minutes, the script wouldn’t run again until the first process had finished, and I could potentially miss the changes that happened during that slack time. Or what if it failed entirely and then I lost track of when the data had last been checked? Or… I’m sure my mind wandered off into all the things that could go wrong.
So I thought a better way to handle this was to record the date and time the data had been checked in a little text file and then read it back the next time the script ran. The file acts as a kind of safety net in case things don’t go as planned.
Now the script can be run as often or as seldom as needed and all the changes should be caught.
Here’s how I did it…
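A minimal sketch of that safety-net file (the path, the timestamp format, and the one-day fallback window are my own choices for illustration, so adjust them to suit):

```python
import datetime
import os

# Assumed location for the little text file -- change to suit
CHECK_FILE = r"C:\temp\last_checked.txt"
TIME_FORMAT = "%Y-%m-%d %H:%M:%S"

def read_last_checked(path=CHECK_FILE):
    """Read the date/time of the last successful check from the file.

    If the file is missing (first run, or something went wrong last
    time), fall back to a window far enough back to catch everything.
    """
    if not os.path.exists(path):
        return datetime.datetime.utcnow() - datetime.timedelta(days=1)
    with open(path) as f:
        return datetime.datetime.strptime(f.read().strip(), TIME_FORMAT)

def write_last_checked(when, path=CHECK_FILE):
    """Record the date/time this check ran for the next run to read."""
    with open(path, "w") as f:
        f.write(when.strftime(TIME_FORMAT))
```

The archiving script reads the file at startup, queries for everything edited since that timestamp, and only writes the new timestamp after the archive step succeeds – so a failed run just widens the window for the next one.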
Tick tock… time is a real bummer now that Esri’s editor tracking settings record time in UTC. Now the create date and last edit date are recorded in a time zone that most of us don’t work in. (I understand why, I just want to grumble a little bit.) So… that opened up a new opportunity to learn about time shuffling in Python.
What I wanted to do was write a script that was going to be run every so often. Maybe every 5 minutes, maybe every 30. It needed to check for changed data in an SDE feature class and archive whatever changes had been made.
Originally I wanted the script to figure out what time it was five minutes ago, but in UTC, so that I could query the last edit and last modified dates out of a feature class stored in SDE. I did a lot of reading about time formatting. The most straightforward explanation I could find was here: http://www.doughellmann.com/PyMOTW/datetime
The best thing I have to tell you is that there is a built-in UTC converter in Python! Hooray!
It works like this –
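A minimal sketch (the field name `last_edited_date` and the five-minute window are assumptions for illustration – use whatever your editor tracking fields and schedule actually are):

```python
import datetime

# Local clock time and the equivalent UTC time, straight from the
# standard library -- no manual offset math needed.
local_now = datetime.datetime.now()
utc_now = datetime.datetime.utcnow()

# "Five minutes ago, in UTC" -- the value to compare against the
# UTC create/edit dates Esri now stores in the feature class.
utc_window_start = utc_now - datetime.timedelta(minutes=5)

# An SDE date field is typically queried with a string like this
# (field name is an assumption -- substitute your own):
where = "last_edited_date >= timestamp '{0}'".format(
    utc_window_start.strftime("%Y-%m-%d %H:%M:%S"))
print(where)
```

That `where` string can then go straight into a search cursor as the where clause.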
Sometimes you just want to know where things are in the script when you are debugging. Or where things came from. You can pass that information or report it while debugging using this snippet. Easy, simple.
"""Returns the current line number in our program."""
print "this is line", %lineno()
print " the line number, %s, is in the middle of this sentence"%lineno()
i = type(lineno())
results in:
this is line 8
the line number, 10, is in the middle of this sentence
Note that the line number is type integer.
Thanks to Danny Yoo and the ActiveState recipes for this easy tip.
Calculating the latitude and longitude of a point that is not in a geographic coordinate system was a tricky thing for me to figure out one day.
My data was not in the “World Projection” (as ArcMap calls it), so when I finally figured out how to access the x,y coordinates of the point, they were not in lat/long. So basically I reproject the file to a geographic coordinate system, extract the x,y coordinates, then reproject back to State Plane, which is what I need. Not too elegant, but it gets the job done.
Here’s how I accomplished it:
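For single points, the same round trip can also be sketched in memory with arcpy geometry objects (`projectAs` is available at ArcGIS 10.1 and later); the WKIDs and coordinate values below are just examples, so substitute your own State Plane zone:

```python
import arcpy

# Spatial references by well-known ID -- 4326 is WGS 1984 (lat/long);
# 2926 is an example State Plane zone, so swap in your own
sr_state_plane = arcpy.SpatialReference(2926)
sr_wgs84 = arcpy.SpatialReference(4326)

# A point in State Plane coordinates (example values)
pt_geom = arcpy.PointGeometry(arcpy.Point(1250000.0, 230000.0),
                              sr_state_plane)

# Project to WGS 1984 and read latitude/longitude off the geometry
geo = pt_geom.projectAs(sr_wgs84)
lon, lat = geo.firstPoint.X, geo.firstPoint.Y

# Project back to State Plane when you need to keep working there
back = geo.projectAs(sr_state_plane)
```

On an older arcpy, the reproject-the-whole-file approach with Project_management does the same job, just with an intermediate feature class on disk.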
I just solved a problem that had perplexed me for a week. I have banged my head and the heads of my colleagues on an issue that had pretty much halted a project. It was solved today because someone posted a little note on the Esri forums 4 years ago, and I am so grateful to them that I would like to hug them. Thank you Rafael Ferraro! I would thank you in the forums, but the comments have been closed in the old archived forums.
So this brings to mind the importance of each and every one of us participating in our communities. Each voice is important.
Here’s a rough script (but it does work) that reads EXIF information from already downloaded photos (tested with an iPhone) and creates a file geodatabase using the latitude and longitude stored with the photo.
You have to have had GPS active while taking the photos. The pictures are moved into the file geodatabase and a hardcoded path is stored. Eventually I’d like to calc that as a relative path and leave the photos where they are. The benefit of moving them into the geodatabase is that they won’t get lost, but you duplicate the photos and increase storage needs.
The settings for running the script with arguments, or with a hardcoded path, are near the bottom. Be sure to modify the paths to be what you want.
Click on through to get to the script.