arcpy.ListFeatureClasses() Bug

I was recently re-evaluating our back-up procedures and discovered a nasty bug in arcpy's ListFeatureClasses request. If a geodatabase contains a feature dataset and a feature class with the same name, ListFeatureClasses may not find the feature class or anything else in that feature dataset.

Unfortunately, we recently moved our daily backup to a Python-based system that uses ListFeatureClasses and got bit by this bug.

After discovering missing data in our backups, I reconstructed what happened and found this bug. Below is arcpy code that iterates through the feature datasets in a geodatabase and lists the feature classes:

import arcpy

def copyAll():
    for iFeatureClass in arcpy.ListFeatureClasses():
        print(" Feature Class: {0}".format(iFeatureClass))

testGDBname = "mgs_sandbox.sde"
arcpy.env.workspace = testGDBname

copyAll()
for iFD in arcpy.ListDatasets("","Feature"):
    print("Feature Dataset {0}:".format(iFD))
    arcpy.env.workspace = testGDBname+"/"+str(iFD)
    copyAll()

And here is a screen shot of the contents of a test enterprise geodatabase; you'll see it has a feature dataset named "outcrops" that contains a feature class also named "outcrops":
[screenshot: sandbox]

And the results list only the feature dataset:

[screenshot: results]

But if I rename the feature dataset (e.g. outcrop_fd), the results are what I would hope for:

[screenshots: results, results2]

I found that the feature class does not even need to be inside the feature dataset for the problem to occur. The problem is also intermittent; the code has run successfully in some cases.

Once I confirmed the problem, I found this thread from almost three years ago that mentions the bug. One poster indicated the same thing occurs in ArcObjects, which leads me to think something may not be getting registered correctly in the SDE tables.

I was not able to re-create this using either personal or file geodatabases.

So I am adopting the policy of never giving a feature dataset the same name as a feature class.
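Until the bug is fixed, a quick sanity check over the name lists can flag risky duplicates. This is a pure-Python sketch (no arcpy); in practice the two lists would come from ListDatasets and ListFeatureClasses, and the names below mirror my test geodatabase:

```python
def name_collisions(dataset_names, feature_class_names):
    """Return names (lower-cased) used by both a feature dataset and a feature class."""
    datasets = {n.lower() for n in dataset_names}
    feature_classes = {n.lower() for n in feature_class_names}
    return sorted(datasets & feature_classes)

# Mirrors the sandbox geodatabase above: dataset and feature class both named "outcrops"
print(name_collisions(["outcrops"], ["outcrops", "bedrock_wells"]))  # ['outcrops']
# After renaming the dataset, no collision:
print(name_collisions(["outcrop_fd"], ["outcrops", "bedrock_wells"]))  # []
```

Running this as part of the backup script would at least warn us before ListFeatureClasses silently skips data.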

 

Domain Sorter Add-In Version 1.1

Almost a year ago, I updated ESRI's Domain Sort code for VB 6 to work with ArcGIS 10. Recently, I had a comment that this Add-In caused ArcCatalog to explode if you had an open OLE connection. When I tested it, it turned out the reports were accurate.

I got around to adding a Try-Catch around the offending chunk of code and it is now better than ever. You can download just the Add-In or the Add-In with source code, or get it from ESRI's ArcGIS Resource Center.

Feature classes and Tables with names starting with “nd_”.

Random luck led me to discover a bug related to feature classes whose names start with "nd_".  It appears that you are allowed to create feature classes starting with "nd_" but ArcCatalog will not display them.  Further research shows this behavior also occurs for tables, and for ArcSDE (Postgres), personal, and file geodatabases. I am using ArcCatalog 10.0.

I first noticed something odd while importing a series of shapefiles into a geodatabase.  After importing 15 shapefiles, I had only 13 feature classes despite receiving no errors during the process.  The two shapefiles that failed to import were named ND_oil_and_gas.shp and ND_Bendix_Study.shp.  Subsequent attempts to import them individually returned an "Invalid Target Name" error.

I discovered in pgAdmin III (Postgres SDE Geodatabase) that the table existed and there was an entry in sde.sde_layers for the feature class but ArcCatalog refused to show it.

I used some unsupported methods to try to resolve the problem and, despite some sweating, I failed to find a way to get ArcCatalog to display these feature classes.  I did, however, at least find a way to delete them: arcpy can detect that the feature classes exist, so it is able to delete them.

At least by deleting them, I can keep "invisible" feature classes from hanging out in my geodatabase.

I suspect the problem stems from how ESRI has implemented the network dataset table-naming structure: dirty areas are stored in tables named nd_<itemid>_dirtyareas and nd_<itemid>_dirtyobjects.  Possibly the developer working on the ArcCatalog GUI ended up suppressing the display of feature classes and tables whose names start with "nd_".
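If my guess is right, the GUI-side filter would behave something like this sketch (pure Python, purely illustrative; this is my hypothesis, not ESRI's actual code):

```python
def visible_in_catalog(names):
    """Hypothesized ArcCatalog filter: hide any item whose name starts with 'nd_'."""
    return [n for n in names if not n.lower().startswith("nd_")]

# The network-dataset bookkeeping tables get hidden, but so does a user's
# legitimately named feature class:
print(visible_in_catalog(["wells", "ND_oil_and_gas", "nd_1_dirtyareas"]))  # ['wells']
```

A case-insensitive prefix test like this would explain why ND_oil_and_gas disappears along with the internal nd_<itemid>_* tables.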

And, just for posterity’s sake, here is a python code snippet listing the feature classes in a workspace:

import arcpy

arcpy.env.workspace = "c:/temp/_nd/F.gdb"

print(arcpy.env.workspace)
for fc in arcpy.ListFeatureClasses():
    print(fc)

print("Done!")

Extract Values to Points (Spatial Analyst) Bug

One of the Spatial Analyst tools we often use in ArcGIS is the “Extract Values to Points” tool.  This allows us to take a point file (well locations in our case) and attach a value (elevations) from a raster image (a DEM) to each point.

Today I was running it for the first time against an Image Service we recently published and received a warning message, "WARNING 000957: Skipping feature(s) because of NULL or EMPTY geometry".  But the script seemed to run and the final result said "Succeeded", so I thought it was probably fine.  But as I double-checked, I realized the results were wonky.

Turns out that I had two records with null geometry in my point file of 397 records.  These two records threw the above warning but actually had a value in the [Rastervalu] field; in fact, all 397 records had values.  The two bad records were consecutive; let's say they were the 100th and 101st records in my shapefile.  What happened is record 100 got the value for record 102, record 101 got the value for record 103, and record 102 (which has valid geometry) got the value for record 104.  This pattern, each record holding the value for the record two places after it, continued until record 396, which had the value for record 397.  Record 397 also had the value for record 397.  So the final three records all had the value for the final record.

What I would have expected is for the two records with null geometry to have null values in the [Rastervalu] field and the rest of the records to have the correct values.  Despite the warning, it is very misleading for all the records to end up with a value.

I have a simplified example below.  I made a point shapefile with four records.  The first, third, and fourth records have valid geometries; the second has null geometry.  The second record ends up with the value for the third record.  The third record has the value for the fourth.  The fourth record, being the last, ends up with the last valid value, which was its own.

The results that I would have hoped for would be for the third record to have a Null value.

The way I envision what is occurring behind the scenes is this:  the process makes a list (more of a stack in programming terms) of result values as it processed the points but just assumes that every record will return a value so it does not track which value goes with which shape.

When it reached the two null geometries, it threw a warning but continued on.  It did not add a value for these records to the stack of values (when it comes across records with valid geometry that do not intersect the raster, it adds a pseudo-null value of -9999 to the stack).  After it processed all the records, it had 395 values in the stack.  It then went one-by-one through the stack and populated the records in the output shapefile: the first record got the first value in the stack, the second record got the second value, the 100th record got the 100th value (which came from the location of the 102nd record), and so on.  At the end, the final two records received the last valid value.
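This hypothesized bookkeeping can be sketched in plain Python (no arcpy; this is my guess at the internal logic, not ESRI's actual code). Reproducing the four-record example: the second point has null geometry, and the misalignment falls out of the positional assignment:

```python
def extract_values_buggy(points, sample):
    """Simulate the suspected bug: sampled values are stacked without tracking
    which record they came from, then assigned back positionally."""
    # Null geometries contribute nothing to the stack
    stack = [sample(p) for p in points if p is not None]
    out = []
    for i in range(len(points)):
        # Records past the end of the stack reuse the last valid value
        out.append(stack[i] if i < len(stack) else stack[-1])
    return out

# Four records; the second has null geometry, raster value is x * 10
points = [(0, 0), None, (2, 0), (3, 0)]
print(extract_values_buggy(points, lambda p: p[0] * 10))  # [0, 20, 30, 30]
```

The null-geometry record silently receives the next record's value, every later record is shifted, and the tail is padded with the last valid value, exactly the pattern I saw in the 397-record file.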

This final behavior (using the last valid value) corresponds to a behavior we've seen with ArcObjects in general: when iterating through a table, if a field is null for a specific row, the last non-null value for that field is often returned.

I'm in the process of submitting a bug to ESRI.  I'm not sure whether it existed prior to ArcGIS 10.0 (I'm guessing it did) or whether it occurs in other processes (I'm guessing it does).  I did find that "Extract Multi Values to Points" works as expected.  I'm guessing that is because, unlike "Extract Values to Points", which creates a new shapefile, this tool appends fields to the existing shapefile and presumably processes records one-by-one without putting the results in a virtual stack.  The "Extract Multi Values to Points" tool also does not throw any warnings.


Multiple geodatabases and ArcSDE services from one PostgreSQL server.

To better organize our ArcSDE data, we wanted to create multiple geodatabases and multiple ArcSDE services using one PostgreSQL database cluster (a cluster containing one machine at this point).  A side question: why can't tables and rasters be placed in feature datasets?  That wouldn't be an end-all solution for what we want to do, but there are some messy consequences of this limitation.

ESRI has instructions on Setting up multiple geodatabases in one PostgreSQL database cluster on Windows, which were helpful, but we repeatedly got a "The ArcSDE Repository was unsuccessfully completed.  Would you like to view this error?" error during the Repository Setup step (step two of four) of the ArcSDE Post Installation process.  The odd thing, however, was that the wise_err.log was blank, so I had no clue why it was failing.  After much Googling and head-banging, I came across a post from Kim Doty on ESRI's forums reporting the same problem; they were able to create a service by just continuing through the process.  Figuring I had nothing to lose, I thought I might as well go through the entire ArcSDE Post Installation process and see what happened.  Although I received blank error messages, I was able to successfully create the second ArcSDE instance.

I will touch on some of the highlights of the process.  In my case, I was creating a database & service with these parameters:

  • Database name: mgsgdb_lidar
  • Service name:  mgs1_sde
  • Tablespace folder: D:mgsdb1dmgsgdb_lidar

Following ESRI’s instructions, I first made custom copies of dbinit.sde, dbtune.sde, and giomgr.defs that contain the service name (mgs1_sde) within the file name.  These copies were named dbinit_mgs1_sde.sde, dbtune_mgs1_sde.sde, and giomgr_mgs1_sde.defs.

When I have trouble with the ArcSDE Post Installation, I like to go through the four steps individually, so I select a "Custom" install.

The four steps making up the ArcSDE Post Installation process are shown on the next dialog.

I was able to complete the first step, Define SDE User Environment, without any problems.

Make sure your database name, default tablespace, and tablespace folder are distinct from those of your first installation.

During the second step of the ArcSDE Post Installation process, Repository Setup, make sure to use the custom files you made earlier when you come to the ArcSDE configuration files dialog.

Later in the Repository Setup, make sure to use the correct database name:

This is the first error message I received in the process (still during Repository Setup):

Clicking “Yes”, however, shows an empty wise_err.log file:

After viewing this, I canceled out of the Repository Setup step.

Taking a look in pgAdmin III, however, I can see that both my database (mgsgdb_lidar) and tablespace (mgsgdb_lidar) were created:

I received the same error message during the third step, Authorize ArcSDE, as in the second step:

But running the fourth step, Create ArcSDE Service, resulted in this sweet message:

After editing my pg_hba.conf and opening the port in my firewall, the service was created, running, and visible.  So far, it seems to be fully functional and without problems.
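For reference, the pg_hba.conf change is just a standard host rule granting the client subnet access to the new database. The address range and auth method below are illustrative, not necessarily what I used:

```
# TYPE  DATABASE       USER  ADDRESS          METHOD
host    mgsgdb_lidar   all   192.168.1.0/24   md5
```

PostgreSQL has to be reloaded (or restarted) for pg_hba.conf changes to take effect, and the service's port must also be allowed through the Windows firewall.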

Walkthrough: Building custom UI elements using add-ins (ArcObjects .NET 10 SDK)

I was working my way through this ESRI Walkthrough: Building custom UI elements using add-ins (ArcObjects .NET 10 SDK) and came across a couple of minor errors that I had to correct during the process.

First, while implementing the OnClick() code for ZoomToLayer.vb, Visual Studio gave me a “Name ‘ArcMap’ is not declared.” error.

In the walk-through, they mention that ArcMap should appear as a member of your class.  For me, however, it appeared under My.  Not sure if this is something specific to my set-up or, as I'm guessing, something that got changed after the first documentation was created and the final libraries published.

The fix is just adding  “My.” to the namespace in this line:

ZoomToActiveLayerInTOC(TryCast(ArcMap.Application.Document, IMxDocument))

To get this:

ZoomToActiveLayerInTOC(TryCast(My.ArcMap.Application.Document, IMxDocument))

When I added the code for AddGraphics.vb, I got eight errors.  There were essentially two errors, each repeated four times.  I took a screen shot after fixing the first error pair:

The fix in this case was also to use the complete namespace path.  Examples:

Change this:

If (geometry.GeometryType) = esriGeometryType.esriGeometryPoint Then

To this:

If (geometry.GeometryType) = ESRI.ArcGIS.Geometry.esriGeometryType.esriGeometryPoint Then

And change this:

simpleMarkerSymbol.Style = esriSimpleMarkerStyle.esriSMSCircle

To this:

simpleMarkerSymbol.Style = ESRI.ArcGIS.Display.esriSimpleMarkerStyle.esriSMSCircle

Overall, the walk-through is very well done and needed just a couple of minor tweaks.  I am now working my way through modifying an existing solution (one that includes seven projects) to see if I can create an ArcGIS 10 Add-In.

ArcGIS 9.3.1 to ArcSDE 10 Connection Error

We finally installed an instance of ArcSDE 10 today.  My first attempt at connecting in ArcCatalog 9.3.1 failed with the following error:

Failed to connect to the specified server.

This release of the GeoDatabase is either invalid or out of date. [Please run the ArcSDE setup utility using the -o install option.]

DBMS table not found[sde.sde.GDB_Release]

 

Turns out the solution was simple: this article points out that Service Pack 2 is required.  After updating my ArcGIS, the problem was solved.

I was the proud, new recipient of a non-misleading error message.  After some minor head-slapping, I realized that 9.3.1 and older versions of ArcGIS are not able to connect to ArcSDE 10.  Doh!  My bad; I should have read ESRI's clear information on this topic.  At least I never actually tried to install ArcSDE with the -o option as the first error message instructed me to do.

Workstation Arc/Info Interchange Import Error

I have been working on updating some data that was published using Arc/Info Workstation Export (.e00) files, converting them to shapefiles, or in the case of annotation, to a geodatabase feature class.

While converting some INFO tables, I have received this error:

Unable to encode the 38th item in the 65th record for the INFO file

It turns out that Workstation can create export files that contain values that do not fit within their fields.  I'm guessing that somehow you are actually able to enter invalid values into those fields, but I did not confirm that.

The error, "Unable to encode the 38th item in the 65th record…", was only partially helpful.  The error was in the 65th record, but I was not able to figure out which field contained the error.  The natural assumption, that it was the 38th column, was incorrect, at least if you are counting from left to right.  Whatever logic is used is consistent, at least within a specific table: I had numerous records with problems, and when the error messages indicated it was the same item, it did turn out to be the same field.

What I ended up having to do was open the export file in my favorite text editor, locate the data for the problematic record, and go through the columns one-by-one, comparing the value in the text file with that in the table.  In this case, 14.1 was supposed to be the value in the WP_TIO2 field.


One thing I noticed, and this is consistent with behavior I have observed in ArcObjects in other cases, is that the record with the invalid value ends up with the same value in that field as the previous record.  So in this case, the bad record has a value of 1.12 in the WP_TIO2 field because that is what the previous record has.  That leads me to believe that ArcObjects is re-using a variable as it does the import, just updating each value, field-by-field.

Just to make sure that I had diagnosed the problem correctly, I attempted to change the value to 14.1 in an ArcMap edit session and got this message:

A look at the field properties confirmed that the field would not accept 14.1 as a value.  The precision (3) means the field can hold 3 digits, with 2 of them (the scale) right of the decimal point, leaving only one digit available to the left of the decimal point.
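The constraint can be sketched in plain Python (illustrative only, not how INFO actually validates values):

```python
def fits_field(value, precision, scale):
    """True if value fits a numeric field with 'precision' total digits,
    'scale' of them to the right of the decimal point."""
    rounded = round(abs(value), scale)
    # Digits left of the decimal point may use at most (precision - scale) places
    return rounded < 10 ** (precision - scale)

print(fits_field(1.12, 3, 2))   # True: one digit left of the decimal fits
print(fits_field(14.1, 3, 2))   # False: two digits left, only one allowed
```

So 14.1 can never be stored in the WP_TIO2 field as defined, which is why the edit session rejected it.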

Once I identified the problem, I used two methods to correct it. The majority of my cases were ones where a pseudo-null value was being used to indicate that a field was really null; in some cases, either -999.9 or 100 was being used. In these cases, I just substituted a different pseudo-null value (-99.9) that conformed to the field limitations. The data I was working with was not tightly controlled in the past, and there were other existing records that used these values for the same fields, so I was able to do this without much concern.

Some values, however, were not nulls and I felt needed to be retained. For these, I had to add a new field that could contain the values, copy the values over, and then edit the ones that had caused problems during the import. Once complete, I exported the data to a new table with the new table structure I created.