
    Loading Large Data Sets Into ArcFM Model

    Frank Bailey

      Hi, my name is Frank Bailey and I work for Central Hudson Gas and Electric. We've just finished generating a complete set of secondary features in a file geodatabase, and now we're attempting to load those features into our ArcFM model. We've compressed the database and are putting together the loading procedure that we'll need to perform in production. The loading is pretty slow, and I have a feeling this is related to the database being versioned. We'd like to un-version the data and do the loading with Object Loader directly against Default. My question is: does un-versioning and then re-versioning the database break or otherwise impact ArcFM in any way? I know there's also a script to disable auto-updaters from running; should we consider doing this as part of the loading process as well? This is our first crack at loading a large chunk of data into the ArcFM database, so any suggestions are welcome! Thanks!
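      In case it helps anyone scripting this later, here's a rough arcpy sketch of the pre-flight check we've been running to confirm the target classes are registered as versioned. The connection file path and class names are just placeholders, not our real ones:

          # Confirm which target classes are registered as versioned before
          # the bulk load. Connection path and class names are placeholders.
          import arcpy

          SDE_CONNECTION = r"C:\connections\electric_prod.sde"      # placeholder
          TARGET_CLASSES = ["SecondaryConductor", "ServicePoint"]   # placeholders

          arcpy.env.workspace = SDE_CONNECTION
          for name in TARGET_CLASSES:
              desc = arcpy.Describe(name)
              print("{}: versioned={}, canVersion={}".format(
                  name, desc.isVersioned, desc.canVersion))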


        • Re: Loading Large Data Sets Into ArcFM Model
          David Miller

          A few questions first:


          1.  Do you have a lot of versions in your database already?

           2.  What are you loading the data into?  Existing feature classes or brand new ones?

          3.  Are the target feature classes already a part of the geometric network and configured with the ArcFM model names and such?

           4.  What mechanism are you using to load the data into the database?  ArcMap or ArcCatalog?

          5.  How big is the file geodatabase with the secondary features?

            • Re: Loading Large Data Sets Into ArcFM Model
              Frank Bailey

              1. No. In the test environment we've taken the database to state 0 and done a full compress (see the sketch after this list). The plan is to do the same in the production environment once we've fully documented the loading procedure.

              2. We are loading the data into existing but empty feature classes and two related tables.

              3. Yes.

              4. We have been using the "Object Loader" tool within ArcMap. Currently the data is still versioned, and we've been testing out creating a session and running Object Loader to import secondary conductor and service points feeder by feeder. So far it seems to work well and pulls in the related table records for conductor info as well. However, it's pretty slow.

              5. The file geodatabase is about 670 MB, but it also includes a full replica of the entire electric model; the secondary data in it is a subset of the total. I can extract it to its own gdb if knowing the exact size would be useful.
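              For anyone following along, here's roughly how we've scripted the compress step in our test environment. The connection path is a placeholder, and the reconcile/post/delete options are our assumptions for getting a clean state tree:

                  import arcpy

                  SDE_ADMIN = r"C:\connections\electric_admin.sde"  # placeholder

                  # Reconcile and post every version against DEFAULT and then
                  # delete them, so the compress can collapse the state tree
                  # all the way down to state 0.
                  arcpy.management.ReconcileVersions(
                      SDE_ADMIN,
                      "ALL_VERSIONS",
                      "sde.DEFAULT",
                      acquire_locks="LOCK_ACQUIRED",
                      with_post="POST",
                      with_delete="DELETE_VERSION",
                  )
                  arcpy.management.Compress(SDE_ADMIN)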


              Thanks for getting back to me, Dave!


              Frank

                • Re: Loading Large Data Sets Into ArcFM Model
                  David Miller

                  I've never used the Object Loader tool in ArcMap, so I had to read up on it...


                  You can't un-version the database with what you're doing.  Object Loader runs in an edit session, and you can only edit a geometric network that is versioned.  One option that will save you a potentially long reconcile/post step is to set the DEFAULT version to public and run your process right against it.  I would strongly recommend you have a full backup of your database before you do this, though (and thank god you have a test environment to try this all out on first).
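                  If you end up scripting that, flipping DEFAULT's access is a single tool call each way. I'm assuming an admin connection file and that your DEFAULT is normally protected; adjust for your setup:

                      import arcpy

                      SDE_ADMIN = r"C:\connections\electric_admin.sde"  # assumed

                      # Open DEFAULT for public editing so the load can go
                      # straight against it.
                      arcpy.management.AlterVersion(SDE_ADMIN, "sde.DEFAULT", access="PUBLIC")

                      # ... run the Object Loader session / data load here ...

                      # Lock it back down once the load is verified (assuming
                      # it was PROTECTED before).
                      arcpy.management.AlterVersion(SDE_ADMIN, "sde.DEFAULT", access="PROTECTED")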


                  The slowness you're experiencing is probably two things: the Object Loader tool is building out the network connectivity as it goes, and the Feeder Manager AUs are firing.  If you want the secondary network to actually inherit the feederIDs, trace weights, etc., you need the AUs to fire off.  If you turn off all the AUs, then you will have to run Trace a Feeder on every feeder once the data load is done to populate all that stuff.  But it might be quicker to turn off the AUs, load the data, compress the database, turn the AUs back on, and then run Trace All Feeders...  It really depends on when and how often the Feeder Manager AUs fire off during the Object Loader process.  Since you have the test environment, you could always try it out with a couple feeders and see how it goes.
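                  As a sketch, the whole sequence might look like the following. The AU toggle and Trace All Feeders are ArcFM-side steps with no arcpy equivalent, so set_autoupdaters() is a made-up stand-in for whatever disable/enable script you have; only the arcpy calls are real:

                      import arcpy

                      SDE_ADMIN = r"C:\connections\electric_admin.sde"  # assumed

                      def set_autoupdaters(enabled):
                          """Made-up stand-in for the ArcFM AU enable/disable script."""
                          print("TODO: call the ArcFM AU toggle here (enabled=%s)" % enabled)

                      set_autoupdaters(False)   # 1. turn the Feeder Manager AUs off
                      # 2. load the secondaries feeder by feeder (Object Loader in ArcMap)
                      # 3. collapse the edit states once everything is posted
                      arcpy.management.Compress(SDE_ADMIN)
                      set_autoupdaters(True)    # 4. turn the AUs back on
                      # 5. run Trace All Feeders from the ArcFM tools in ArcMap to
                      #    populate feederIDs, trace weights, etc. (no arcpy equivalent)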


                  I was curious about the size of the database just to get a rough idea of how much data you're trying to load.  It doesn't sound like it's too big, but you might just have to slog through the slowness with all the connectivity/validation/AUs that are firing as you build stuff out.
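                  And if you do copy the secondary data out to its own file geodatabase, sizing it is easy, since a file gdb is just a folder on disk (the path below is made up):

                      import os

                      gdb = r"C:\data\secondary_only.gdb"  # made-up path
                      total = sum(
                          os.path.getsize(os.path.join(root, f))
                          for root, _, files in os.walk(gdb)
                          for f in files
                      )
                      print("%.1f MB" % (total / 1024.0 / 1024.0))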

                    • Re: Loading Large Data Sets Into ArcFM Model
                      Frank Bailey

                      Awesome, David, thanks for your insight on this. I really appreciate it.


                      I think you're correct: it might be quicker to load the data with AUs disabled and then run the Trace All Feeders tool afterward. We'll give that a try in the test environment and see if it speeds things up.


                      I also like the idea of going right against Default as opposed to creating a session and editing there. Once we've nailed down all the steps to load the data in our throwaway edit session, I think we'll try that in test as well.


                      And yes, backups are totally key. Before we run this in Prod we'll have our DBA take a backup of the database. We also have nightly backups currently running in the prod environment that get archived offsite. Belt and suspenders!