2 Replies Latest reply on Nov 22, 2013 3:10 PM by Robert Krisher

    Change Detection in Feeder Manager 2.0?

    Robert Krisher

      Pre-Feeder Manager 2.0 we could rely on the ESRI versioning technology to determine when a feature was being fed by a different feeder.  Because feeder information is no longer stored on the feature, it is no longer subject to the ESRI versioning framework.  This is great from a performance perspective, but it seems likely to cause problems with any code that was relying on those edits to extract feeders.  1) What is the recommended solution for this problem?  2) Have we documented this solution to ensure that other vendors (Cyme, Oracle NMS, etc.) are still able to integrate with our software?

        In the interest of partially answering my own question, it seems like the current solution is to not use Feeder Manager 2.0 (or maybe to have it turned on, but still keep the feeder fields in the physical model).  This would allow all pre-existing feeder extraction / change detection technology to continue to function as it had in the past, but limits the benefit provided by Feeder Manager 2.0.  For completely custom implementations, it looks like one approach would be to dump out all of the feeder information to a staging database where it can then be diffed with a previous extract.  While this process works, a full-database compare just to find changed features is an inefficient solution that is likely to either not scale well or completely fall on its face with larger datasets.  3) Is there a more clever way to achieve this?  Could the IFeederFields interface be used to achieve this (through some careful design and planning)?
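        The staging-database diff described above can be sketched in a few lines. This is a minimal illustration, assuming each extract is a CSV of (feature_id, feeder_id) rows produced by some feeder export; none of these names come from the ArcFM or ESRI APIs.

        ```python
        # Hypothetical sketch: diff two feeder extracts instead of comparing
        # against the live geodatabase. Column names are illustrative.
        import csv

        def load_extract(path):
            """Load an extract CSV into a {feature_id: feeder_id} dict."""
            with open(path, newline="") as f:
                return {row["feature_id"]: row["feeder_id"]
                        for row in csv.DictReader(f)}

        def diff_extracts(previous, current):
            """Return feature IDs added, removed, or re-fed since the last extract."""
            added   = [fid for fid in current if fid not in previous]
            removed = [fid for fid in previous if fid not in current]
            changed = [fid for fid in current
                       if fid in previous and previous[fid] != current[fid]]
            return added, removed, changed
        ```

        Even done this way it is still an O(n) pass over the full extract, which is exactly the scaling concern raised above; it just avoids touching the production database while diffing.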

        • Re: Change Detection in Feeder Manager 2.0?
          Matthew Crooks

          Hi Robert -


          Currently there are two methods of extracting Feeder Information via Feeder Manager 2.0 for use outside of ArcMap: the Geoprocessing Tools and the FM2.0 public API. Network Adapter is also fully compatible with FM2.0.


          Depending on the size of the database, an entire extract of the data via our Geoprocessing Tool may be the most effective way of obtaining Feeder Information. I would give this a try to determine how long it will take for the particular database you're working with. I know Glenn Farrow and Kyle Erickson have experimented with this approach and may have some additional comments to add here.


          As this is the initial release of FM2.0, we have some additional functionality in mind that we would like to add to make it easier to extract changed feeders. The primary way to address this will be a "dirty" feeders list that tracks changed feeders over the course of edit operations. This list could then be leveraged to help facilitate the export of Feeder Information via the Geoprocessing Tool that is currently available at the 10.1.1/10.2 release.
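          The bookkeeping behind a "dirty" feeders list like the one described above could look something like this. This is purely an illustrative sketch; the class and method names are invented and do not reflect any planned ArcFM or ESRI API.

          ```python
          # Hypothetical sketch of a dirty-feeder list: an edit-session listener
          # records which feeders were touched, so an export only has to
          # process those instead of the whole network.
          class DirtyFeederTracker:
              def __init__(self):
                  self._dirty = set()

              def on_feature_edited(self, old_feeder_id, new_feeder_id):
                  """Record both the feeder a feature left and the one it joined."""
                  if old_feeder_id is not None:
                      self._dirty.add(old_feeder_id)
                  if new_feeder_id is not None:
                      self._dirty.add(new_feeder_id)

              def drain(self):
                  """Return the changed feeders and reset the list (e.g. after an export)."""
                  dirty, self._dirty = self._dirty, set()
                  return dirty
          ```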


          Another option that we have been exploring is a way to write FeederID back to the Geodatabase as part of a batch reconcile/post process. This would allow clients to take advantage of the benefits of FM2.0 while editing and working in ArcMap, but would allow existing customizations/processes/integrations to continue to work because the Feeder Information would still be stored on the feature (just not updated after each and every edit). This idea is still in the prototype/design phase and we're trying to figure out the most effective way to do something like this.


          If you have any other comments/ideas/suggestions about how to improve the integration capabilities of FM2.0 please feel free to give me a call. I'm very interested in hearing about use cases and workflows that we can improve.


          One final note...if you're going to experiment with the FM2.0 public API then I would recommend grabbing this patch first.


            • Re: Change Detection in Feeder Manager 2.0?
              Robert Krisher

              Thanks for the info.  I've talked with Glenn Farrow; he was the one who turned me on to the FGDB approach.  That approach seems well suited to larger change sets (daily or weekly extracts), but I'm more interested in the batch reconcile/post approach you've mentioned you guys are looking at, and it's the approach I'm going to take to try and solve this problem.