Hi Robert -
Currently there are two methods of extracting Feeder Information via Feeder Manager 2.0 for use outside of ArcMap: the Geoprocessing Tools and the FM2.0 public API. Network Adapter is also fully compatible with FM2.0.
Depending on the size of the database, a full extract of the data via our Geoprocessing Tool may be the most effective way of obtaining Feeder Information. I would give this a try to see how long it takes for the particular database you're working with. I know Glenn Farrow and Kyle Erickson have experimented with this approach and may have some additional comments to add here.
As this is the initial release of FM2.0, we have additional functionality in mind to make it easier to extract changed feeders. The primary mechanism will be a "dirty" feeders list that tracks changed feeders over the course of edit operations. This list could then be leveraged to facilitate exporting Feeder Information via the Geoprocessing Tool that is currently available at the 10.1.1/10.2 release.
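To make the idea concrete, here is a minimal sketch of what a "dirty" feeders list amounts to. None of these names come from the FM2.0 API (which doesn't expose this yet); the class, methods, and feeder IDs are all hypothetical, purely to illustrate accumulating changed feeders across edits and draining the set at export time:

```python
# Hypothetical sketch of a "dirty" feeders list -- not FM2.0 API.
# Edits mark feeders dirty; the export step consumes the accumulated set.

class DirtyFeederTracker:
    """Accumulates the IDs of feeders touched by edit operations."""

    def __init__(self):
        self._dirty = set()

    def record_edit(self, feeder_id):
        """Called whenever an edit changes a feature on a feeder."""
        self._dirty.add(feeder_id)

    def drain(self):
        """Return the changed feeders and reset the list (e.g. at export)."""
        changed, self._dirty = self._dirty, set()
        return changed


tracker = DirtyFeederTracker()
for fid in ["FDR-101", "FDR-205", "FDR-101"]:  # repeat edits collapse
    tracker.record_edit(fid)

print(sorted(tracker.drain()))  # ['FDR-101', 'FDR-205']
print(tracker.drain())          # set() -- empty once the export consumed it
```

An export job would then hand only the drained feeder IDs to the Geoprocessing Tool instead of re-extracting the whole database.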
Another option we have been exploring is a way to write FeederID back to the Geodatabase as part of a batch reconcile/post process. This would let clients take advantage of the benefits of FM2.0 while editing in ArcMap, while allowing existing customizations/processes/integrations to continue to work, because the Feeder Information would still be stored on the feature (just not updated after every edit). This idea is still in the prototype/design phase, and we're trying to figure out the most effective way to do something like this.
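A rough sketch of the write-back step, under stated assumptions: `trace_feeder()` stands in for FM2.0's runtime feeder determination, and the plain dicts stand in for versioned features. Nothing here is the ArcFM or ArcGIS API; it only illustrates persisting each feature's current FeederID during the batch reconcile/post so downstream readers of the field keep working:

```python
# Hypothetical sketch of FeederID write-back at batch reconcile/post time.
# trace_feeder() and the feature dicts are placeholders, not ArcFM API calls.

def trace_feeder(feature_id):
    """Stand-in for FM2.0's in-memory feeder determination."""
    memberships = {"SW-1": "FDR-101", "TX-7": "FDR-101", "SW-9": "FDR-205"}
    return memberships[feature_id]

def batch_write_feeder_ids(features):
    """As part of reconcile/post, stamp each feature with its current
    FeederID so integrations that read the stored field still work."""
    for feature in features:
        feature["FeederID"] = trace_feeder(feature["id"])
    return features


edited = [{"id": "SW-1", "FeederID": None}, {"id": "SW-9", "FeederID": None}]
for f in batch_write_feeder_ids(edited):
    print(f["id"], f["FeederID"])  # SW-1 FDR-101 / SW-9 FDR-205
```

The design trade-off is exactly the one described above: the stored FeederID is only as fresh as the last reconcile/post, rather than updated on every edit.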
If you have any other comments/ideas/suggestions about how to improve the integration capabilities of FM2.0 please feel free to give me a call. I'm very interested in hearing about use cases and workflows that we can improve.
One final note: if you're going to experiment with the FM2.0 public API, I would recommend grabbing this patch first.
Thanks for the info. Glenn Farrow was the one who turned me on to the FGDB approach. It seems well suited to larger change sets (daily or weekly extracts), but I'm more interested in the batch reconcile/post approach you mentioned you're looking at. I like the idea of writing the Feeder Information back during a batch reconcile/post process, and it's the approach I'm going to take to try to solve this problem.