From 96580978fd046d82b2545fdac8b72ea52bd4956c Mon Sep 17 00:00:00 2001
From: Dilawar Singh <dilawars@ncbs.res.in>
Date: Mon, 7 Nov 2016 12:30:05 +0530
Subject: [PATCH] Squashed 'moose-core/' changes from a15a538..c2c32b9

c2c32b9 Before tagging, running with clang.
1cc19f1 Updated runTime in the annotation field in moose
3ad9558 runTime and solver type are read from the XML file into moose; cleanup in names
5de1f05 Function output to a pool object uses "setN", as the input is "nOut"
ae24a3d Merge branch 'master' of https://github.com/BhallaLab/moose-core
7b43f70 Fixed Function unzombification for increment messages. Added update for time in Zombie Functions.
f4310e6 For now, function output is connected to a pool via the setConc message; there is also "increment", which depends on the function input. This will be fixed later.
f286dca removed writekkit.py duplicate file
6d299eb Replace all objects under /kinetics or /kinetics/group, as required by the GENESIS format
a6dad52 Merge branch 'master' of https://github.com/BhallaLab/moose-core
0ddbd21 Gsolve updates to monitor number of reac events. Stoich updates to avoid crashes when handling edge cases without objects in model.
2ab410d Function can be connected to Pool, BufPool and Reaction
086ef95 Using networkx spring_layout rather than graphviz dot layout
0000d51 Ids beginning with numbers need to be prefixed with a dash for SBML compatibility. Enzymes with no substrate or no product are not written out. Functions are now placed under the compartment; for backwards compatibility with GENESIS they are placed under a BufPool
f0f1444 Merge branch 'master' of github.com:BhallaLab/moose-core
e0c6c1a Allow failure with clang. I could not reproduce the error on my local machine.
9e4db61 First skeleton changes to the Gsolver to track the number of reaction firings for each reaction. Also some fixes to the SeqSynHandler.
cff32e4 Fixes to the include path of muparser. This is passing on openSUSE.
2117969 Added missing SeqSynHandler source file to CMake.
58ddb2b Added SeqSynHandler class to deal with single-neuron sequence recognition models
1e42ece Merge branch 'master' of https://github.com/BhallaLab/moose-core
659a908 Fixes for Ksolve accuracy, and some ReadKkit edge cases
7c3e72e small correction
0d7f908 clean up for Annotation, plots, reactions, enz
a2f6a95 Merge branch 'master' of http://github.com/BhallaLab/moose-core
fb70792 Empty reactions are not written; units corrections; plot paths are now written from the compartment level rather than the model level
a1593e9 Disabled pip install python-sbml with clang.
13acd0f Disabled SBML tests for the time being, till issues #145 and #152 are resolved.
f3b42fd Merge branch 'master' of https://github.com/BhallaLab/moose-core
b16311e Fixed unicode conversion for docstrings. Added sorting of fields in showfields.
793ae62 Added pygraphviz to pip3 install list. pygraphviz is a dependency for SBML.
396e607 Both python2-networkx and python3-networkx can't be installed simultaneously. Fixes that on travis.
ac02cea Added python-networkx to python3 dependency. Test run on travis.
dd4fc4e Removed the WITH_SBML deprecated-option warning [skip ci]
c6c60e6 Removed c++ sbml support code. Test run on travis.
c73e4cd Removed C++ sbml code.
68ea7c2 Changed SBML scripts to make them compatible with python3.
8614680 Fixed Leakage class
8b15511 Merge branch 'master' of https://github.com/BhallaLab/moose-core
24a01e8 Added ICa for NMDA channel to known plot variables for rdesigneur
52ae361 Fix to issue #147.
c251c0d Reaction rateLaw and enzyme Km are corrected
a03cb6a Initial concentration is set with correct units if hasOnlySubstanceUnits=false
7018230 Fixes for SynChan modulation, added a couple of synaptic channel prototypes
66bfcff Upi fixed the scaling factor
b6532e8 Merge branch 'master' of http://github.com/BhallaLab/moose-core
b7a3ad4 MMEnz is created
37c5312 Deprecating SBML-related code on travis; don't download and install the SBML debian package.
791762a Plots populated into moose if defined in the SBML file
b05a983 Adding python-networkx as a dependency in travis. python-networkx seems to be required to run SBML-related code (needs confirmation?) and is a dependency for moose-gui.
76c5381 print statement needed parentheses
d52bd19 Fixed a workaround for a small bug in the draw_graphviz function in networkx-1.11, which no longer imports into the top-level namespace; readSBML cleanup
2fd15b6 Updated to read an SBML file into moose
422bbc0 A test case where a model loaded in moose is written to an SBML file
dbc5412 Removing sbml build from cmake. SBML test is failing.
8876581 Merge branch 'master' of https://github.com/BhallaLab/moose-core
66f55c2 Added a check for whether the python-libsbml module is installed; if yes, reading and writing SBML is done, else a warning message to install the module is printed
073ef60 Merge pull request #144 from dilawar/master
94b686a Fix to BhallaLab/moose-core#143.
b460557 Removing C++ dependency on libsbml and its files
fea73c3 Merge pull request #141 from subhacom/master
7644e69 Fixed adding spike generators in ReadCell dotp.
3928fae Merge remote-tracking branch 'upstream/master'
5822181 Merge branch 'master' of https://github.com/subhacom/moose-core
79d5a5c Updated generated neuroml2 module and test file.
5a4197c Merge branch 'master' of https://github.com/BhallaLab/moose-core
06aea5b SBML is removed from the makefile
2a5bc46 Merge branch 'master' of https://github.com/BhallaLab/moose-core
1a55310 Fixes for enzyme rate scaling when reading kkit files
8f44029 Merge pull request #140 from dilawar/master
f9e8720 Merge branch 'master' of github.com:BhallaLab/moose-core
1aee89a When WITH_SBML=OFF is given from command line, do not compile its support.
c959613 Merge branch 'master' of https://github.com/BhallaLab/moose-core
00a9171 Fixed access functions for kinetic objects and solvers
d70da97 Some more light colors to ignore
4d89fd6 If a color is not defined in GENESIS, a random color from PyQt QColor is used; matplotlib files are taken as saved
5ecd2b1 All coordinates are calculated in kkit and passed to this file; only when run from the command line are auto-coordinates calculated, with an appropriate zoom factor
27bfae9 Using graphviz_layout instead of pygraphviz_layout to get positions for auto-coordinates when laying out the models
1f93ddb Removing the pygraphviz lib dependency and using pygraphviz_layout to get the positions of spring_layout
2c421a0 readSBML gets x and y coordinates from the file and loads them
eba7d75 Cleanup of layout coordinates when saving a model from moose to GENESIS
17863ee 1. Layout coordinates from the GUI are saved into the SBML file. 2. Automatic layout is done using pygraphviz when a model is saved to SBML from the command line. InitialConc is saved with mole as the default unit
395daa7 Fix for muParser for C++ 10
ef2196e Adding quot operator in muparser.
6deb249 Merge branch 'master' of github.com:BhallaLab/moose-core
a6f8bab Updates to rdesigneur
b922890 Fix dependency on hdf5
6f7e46d cleanup
b943072 Added networkx's Graph for laying out coordinates when a model is saved to SBML from the command line; if the model is saved from the GUI, PyQt's scenePos coordinates (used for the GUI layout) are used
8b1a05d Coordinates, group, text color and background color are written to the SBML file in the Annotation field
2718470 Auto-coordinates if one saves a model from the command line; also cleanup of coordinates for GENESIS
41b6bfb Color index is cleaned up per GENESIS requirements, and coordinate calculation is removed since ScenePos is sent instead
ce6b740 Cleanup in writing the GENESIS file with scene coordinates, and a few other cleanups
8f2a3e1 Added compiler macros for AppleClang.
076ecf6 libsbml-5.9.0 is not available on brew. Build breaks with sbml 5.11.0
2976483 Fix apple build. adding -stdlib=libc++ to cmake.
530d69b Merge branch 'master' of https://github.com/BhallaLab/moose-core
d8cd4bd Updated python setup script to include streaming classes
5d6ee59 Actual fix for issue #110
5df0a7f Merge branch 'master' of https://github.com/BhallaLab/moose-core
39145e3 Merge branch 'master' of https://github.com/BhallaLab/moose-core
2634422 Merge branch 'master' of https://github.com/subhacom/moose-core
ddbdd8c Removed mgui and moogli from Python setup.py
c2fa2cf Added hsolve test comparing to NEURON

git-subtree-dir: moose-core
git-subtree-split: c2c32b97caecd422d6bc9398094073ce9f1b7492
---
 .travis.yml                                   |   33 +-
 CMakeLists.txt                                |   67 -
 CheckCXXCompiler.cmake                        |    7 +-
 Makefile                                      |   14 +-
 MooseTests.cmake                              |   16 +-
 basecode/Makefile                             |    2 +-
 basecode/main.cpp                             |    2 +
 basecode/testAsync.cpp                        |    1 +
 biophysics/Leakage.cpp                        |    9 +
 biophysics/Leakage.h                          |    1 +
 biophysics/ReadCell.cpp                       |    2 +-
 biophysics/SynChan.cpp                        |    6 +
 external/muparser/include/muParser.h          |    1 +
 external/muparser/src/muParser.cpp            |    2 +
 external/muparser/src/muParserTokenReader.cpp |    6 +-
 kinetics/Enz.cpp                              |   29 +-
 kinetics/Enz.h                                |    5 +
 kinetics/ReadKkit.cpp                         |   19 +-
 ksolve/Gsolve.cpp                             |   27 +
 ksolve/Gsolve.h                               |    1 +
 ksolve/GssaVoxelPools.cpp                     |    9 +
 ksolve/GssaVoxelPools.h                       |    5 +
 ksolve/Ksolve.cpp                             |   11 +-
 ksolve/OdeSystem.h                            |    2 +-
 ksolve/Stoich.cpp                             |   33 +-
 ksolve/VoxelPools.cpp                         |    2 +-
 ksolve/ZombieEnz.cpp                          |   10 +
 ksolve/ZombieFunction.cpp                     |    4 +-
 pymoose/moosemodule.cpp                       |   10 +-
 python/moose/SBML/__init__.py                 |    4 +-
 python/moose/SBML/readSBML.py                 |  679 +-
 python/moose/SBML/writeSBML.py                |  747 +-
 python/moose/genesis/_main.py                 | 1226 +--
 python/moose/moose.py                         |   24 +-
 python/moose/neuroml2/generated_neuroml.py    | 8259 +++++++----------
 python/moose/neuroml2/generated_neuromlsub.py |  373 +-
 .../neuroml2/test_files/NML2_FullCell.nml     |   24 +-
 .../neuroml2/test_files/SimpleIonChannel.xml  |   37 +-
 python/moose/writekkit.py                     |  506 -
 python/rdesigneur/rdesigneur.py               |   47 +-
 python/rdesigneur/rdesigneurProtos.py         |   65 +-
 python/rdesigneur/rmoogli.py                  |   36 +-
 sbml/CMakeLists.txt                           |    9 -
 sbml/Makefile                                 |   31 -
 sbml/MooseSbmlReader.cpp                      | 1468 ---
 sbml/MooseSbmlReader.h                        |   86 -
 sbml/MooseSbmlWriter.cpp                      | 1137 ---
 sbml/MooseSbmlWriter.h                        |   53 -
 scheduling/Clock.cpp                          |    7 +-
 shell/Makefile                                |    3 +-
 shell/Shell.cpp                               |   10 +-
 shell/Shell.h                                 |    6 +-
 synapse/CMakeLists.txt                        |    5 +-
 ...aupnerBrunel2012CaPlasticitySynHandler.cpp |    2 +-
 ...GraupnerBrunel2012CaPlasticitySynHandler.h |    2 +
 synapse/Makefile                              |   14 +-
 synapse/RollingMatrix.cpp                     |  109 +
 synapse/RollingMatrix.h                       |   61 +
 synapse/STDPSynHandler.cpp                    |    2 +-
 synapse/STDPSynHandler.h                      |    2 +
 synapse/SeqSynHandler.cpp                     |  437 +
 synapse/SeqSynHandler.h                       |  117 +
 synapse/SimpleSynHandler.cpp                  |    1 +
 synapse/SimpleSynHandler.h                    |    2 +
 synapse/SynEvent.h                            |   79 +
 synapse/testSynapse.cpp                       |  132 +-
 tests/python/chem_models/acc27.g              |  347 +
 tests/python/test_sbml.py                     |   12 +-
 tests/python/test_sbml_support.py             |   49 +-
 69 files changed, 6932 insertions(+), 9614 deletions(-)
 delete mode 100644 python/moose/writekkit.py
 delete mode 100644 sbml/CMakeLists.txt
 delete mode 100644 sbml/Makefile
 delete mode 100644 sbml/MooseSbmlReader.cpp
 delete mode 100644 sbml/MooseSbmlReader.h
 delete mode 100644 sbml/MooseSbmlWriter.cpp
 delete mode 100644 sbml/MooseSbmlWriter.h
 create mode 100644 synapse/RollingMatrix.cpp
 create mode 100644 synapse/RollingMatrix.h
 create mode 100644 synapse/SeqSynHandler.cpp
 create mode 100644 synapse/SeqSynHandler.h
 create mode 100644 synapse/SynEvent.h
 create mode 100644 tests/python/chem_models/acc27.g

diff --git a/.travis.yml b/.travis.yml
index 8a896dd7..473c14db 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -20,40 +20,43 @@ env:
     - CTEST_MODEL=Nightly
 cache: apt
 
-matrix:
-    allow_failures:
+#matrix:
+#    allow_failures:
+#        - clang
 
 before_script:
     - sudo apt-get install -qq libxml2-dev libbz2-dev
     - sudo apt-get install -qq libhdf5-serial-dev
     - sudo apt-get install -qq make cmake
-    - sudo apt-get install -qq  python-numpy python-matplotlib
-    - sudo apt-get install -qq  python3-numpy python3-matplotlib python3-dev
+    - sudo apt-get install -qq python-numpy python-matplotlib python-networkx
+    - sudo apt-get install -qq python3-numpy python3-matplotlib python3-dev 
     - sudo apt-get install -qq libboost-all-dev
     - sudo apt-get install -qq libgsl0-dev
-
-before_install:
-    - sbmlurl="http://sourceforge.net/projects/sbml/files/libsbml/5.9.0/stable/Linux/64-bit/libSBML-5.9.0-Linux-x64.deb"
-    - wget "$sbmlurl" -O libsbml.deb && sudo dpkg -i libsbml.deb 
-    - sudo apt-get install -f
+    - sudo apt-get install -qq python-pip python3-pip
+    - sudo apt-get install -qq libgraphviz-dev
+    - # sudo pip install python-libsbml
+    - # sudo pip3 install python-libsbml
+    - # sudo pip3 install pygraphviz
 
 install:
     - echo "nothing to do here"
 
 script:
-    - sudo ldconfig /usr/lib64
     - # First test is normal make scripts. (outdated).
     - make 
-    - # Now test the cmake with gsl
+    - ## CMAKE based flow
     - mkdir -p _GSL_BUILD && cd _GSL_BUILD && cmake -DDEBUG=ON -DPYTHON_EXECUTABLE=/usr/bin/python ..
     - make && ctest --output-on-failure
-    - cd .. # Build with python3.
-    - mkdir -p _GSL_BUILD2 && cd _GSL_BUILD2 && cmake -DDEBUG=ON -DPYTHON_EXECUTABLE=/usr/bin/python3 ..
-    - make && ctest --output-on-failure
     - cd .. # Now with boost.
     - mkdir -p _BOOST_BUILD && cd _BOOST_BUILD && cmake -DWITH_BOOST=ON -DDEBUG=ON -DPYTHON_EXECUTABLE=/usr/bin/python ..
     - make && ctest --output-on-failure
-    - cd .. # Now with boost and python3.
+    - cd .. 
+    - echo "Python3 support. Removed python2-networkx and install python3"
+    - sudo apt-get remove -qq python-networkx
+    - sudo apt-get install -qq python3-networkx
+    - mkdir -p _GSL_BUILD2 && cd _GSL_BUILD2 && cmake -DDEBUG=ON -DPYTHON_EXECUTABLE=/usr/bin/python3 ..
+    - make && ctest --output-on-failure
+    - cd .. # Now with BOOST and python3
     - mkdir -p _BOOST_BUILD2 && cd _BOOST_BUILD2 && cmake -DWITH_BOOST=ON -DDEBUG=ON -DPYTHON_EXECUTABLE=/usr/bin/python3 ..
     - make && ctest --output-on-failure
     - cd ..
diff --git a/CMakeLists.txt b/CMakeLists.txt
index ae7a4e03..7d5d25f7 100644
--- a/CMakeLists.txt
+++ b/CMakeLists.txt
@@ -83,19 +83,11 @@ option(WITH_MPI  "Enable Openmpi support" OFF)
 option(WITH_BOOST "Use boost library instead of GSL" OFF)
 option(WITH_GSL  "Use gsl-library. Alternative is WITH_BOOST" ON)
 
-# If SBML is available, it will automaticall enable the support. If ON, then
-# libSBML must be present.
-option(WITH_SBML  "Enable SBML support. Automatically detected." OFF)
-
 # If GSL_STATIC_HOME is set, we use it to search for static gsl libs.
 option(GSL_STATIC_HOME 
     "Path prefix where static gsl library can be found e.g. /opt/sw/gsl116 " 
     OFF
     )
-option(SBML_STATIC_HOME 
-    "Path prefix where static sbml library can be found e.g. /opt/sbml/"
-    OFF
-    )
 option(HDF5_STATIC_HOME
     "Path prefix where static hdf5 library can be found e.g /opt/sw/hdf5 "
     OFF
@@ -165,57 +157,6 @@ if(WITH_BOOST)
     set(WITH_GSL OFF)
 endif(WITH_BOOST)
 
-# If this variable is used while packaging; we build this project using another
-# cmake script and pass the location of SBML static library using this variable.
-# Otherwise, this is useless. FIXME: This is not a clean solution.
-if(SBML_STATIC_HOME)
-    include_directories(${SBML_STATIC_HOME}/include)
-    find_library(LIBSBML_LIBRARIES
-        NAMES libsbml-static.a libsbml.a
-        PATHS "${SBML_STATIC_HOME}/lib" "${SBML_STATIC_HOME}/lib64"
-        NO_DEFAULT_PATH
-        )
-    message(STATUS "- Using static version of LIBSBML_LIBRARIES: ${LIBSBML_LIBRARIES}")
-    if(LIBSBML_LIBRARIES)
-        message(STATUS "- Successfully located SBML static library: ${LIBSBML_LIBRARIES}")
-    ELSE()
-        message(FATAL_ERROR 
-            "Can't find static libsbml libraries at path ${SBML_STATIC_HOME}"
-            )
-    endif()
-    set(LIBSBML_FOUND ON)
-else(SBML_STATIC_HOME)
-    pkg_check_modules(LIBSBML libsbml)
-    if(NOT LIBSBML_FOUND)
-        message(STATUS "pkg-config could not find sbml. Fallback to default")
-        find_package(LIBSBML)
-    endif()
-endif(SBML_STATIC_HOME)
-
-if(LIBSBML_FOUND)
-    message(STATUS "LIBSBML found ${LIBSBML_LIBRARIES}")
-    include_directories(${LIBSBML_INCLUDE_DIRS})
-    pkg_check_modules(LibXML2 libxml-2.0)
-    if (!LibXML2_FOUND)
-        find_package(LibXML2 REQUIRED)
-    endif()
-    include_directories(${LibXML2_INCLUDE_DIRS})
-else()
-    message(
-        "======================================================================\n"
-        "libsbml NOT found. \n\n"
-        "If you want to compile with SBML support, download and install \n"
-        "libsbml-5.9.0 from: \n"
-        "http://sourceforge.net/projects/sbml/files/libsbml/5.9.0/stable/ and\n"
-        "rerun cmake.\n\n"
-        "If you don't want SBML support then continue with `make`.\n\n"
-        "If you install libsbml to non-standard place, let the cmake know by\n"
-        "exporting environment variable SBML_DIR to the location.\n"
-        "=====================================================================\n"
-        )
-    SET(WITH_SBML OFF)
-endif()
-
 include_directories(msg basecode)
 
 set_target_properties(libmoose PROPERTIES COMPILE_DEFINITIONS  "MOOSE_LIB")
@@ -366,7 +307,6 @@ add_subdirectory(biophysics)
 add_subdirectory(builtins)
 add_subdirectory(utility)
 add_subdirectory(mesh)
-add_subdirectory(sbml)
 add_subdirectory(mpi)
 add_subdirectory(signeur)
 add_subdirectory(ksolve)
@@ -400,13 +340,6 @@ elseif(HDF5_FOUND)
     list(APPEND SYSTEM_SHARED_LIBS ${HDF5_LIBRARIES})
 endif()
 
-LIST(APPEND STATIC_LIBRARIES moose_sbml)
-if(SBML_STATIC_HOME)
-    list(APPEND STATIC_LIBRARIES ${LIBSBML_LIBRARIES})
-elseif(LIBSBML_FOUND)
-    list(APPEND SYSTEM_SHARED_LIBS ${LIBSBML_LIBRARIES})
-endif()
-
 if(WITH_GSL)
     if(GSL_STATIC_HOME)
         message(STATUS "Using STATIC gsl libraries: ${GSL_LIBRARIES}")
diff --git a/CheckCXXCompiler.cmake b/CheckCXXCompiler.cmake
index 7fddd1d2..15a4d730 100644
--- a/CheckCXXCompiler.cmake
+++ b/CheckCXXCompiler.cmake
@@ -6,7 +6,6 @@ CHECK_CXX_COMPILER_FLAG( "-std=c++0x" COMPILER_SUPPORTS_CXX0X )
 CHECK_CXX_COMPILER_FLAG( "-Wno-strict-aliasing" COMPILER_WARNS_STRICT_ALIASING )
 
 
-
 # Turn warning to error: Not all of the options may be supported on all
 # versions of compilers. be careful here.
 add_definitions(-Wall
@@ -32,11 +31,17 @@ if(COMPILER_SUPPORTS_CXX11)
     message(STATUS "Your compiler supports c++11 features. Enabling it")
     set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11")
     add_definitions( -DENABLE_CPP11 )
+    if(APPLE)
+        set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -stdlib=libc++" )
+    endif(APPLE)
 elseif(COMPILER_SUPPORTS_CXX0X)
     message(STATUS "Your compiler supports c++0x features. Enabling it")
     set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++0x")
     add_definitions( -DENABLE_CXX11 )
     add_definitions( -DBOOST_NO_CXX11_SCOPED_ENUMS -DBOOST_NO_SCOPED_ENUMS )
+    if(APPLE)
+        set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -stdlib=libc++" )
+    endif(APPLE)
 else()
     add_definitions( -DBOOST_NO_CXX11_SCOPED_ENUMS -DBOOST_NO_SCOPED_ENUMS )
     message(STATUS "The compiler ${CMAKE_CXX_COMPILER} has no C++11 support.")
diff --git a/Makefile b/Makefile
index 824a4298..65b7c9b3 100644
--- a/Makefile
+++ b/Makefile
@@ -60,7 +60,7 @@
 
 # Default values for flags. The operator ?= assigns the given value only if the
 # variable is not already defined.
-USE_SBML?=0
+#USE_SBML?=0
 USE_HDF5?=1
 USE_CUDA?=0
 USE_NEUROKIT?=0
@@ -230,16 +230,6 @@ else
 LIBS+= -lm
 endif
 
-#harsha
-# To use SBML, pass USE_SBML=1 in make command line
-ifeq ($(USE_SBML),1)
-LIBS+= -lsbml
-CXXFLAGS+=-DUSE_SBML
-LDFLAGS += -L/usr/lib64 -Wl,--rpath='/usr/lib64'
-SBML_DIR = sbml
-SBML_LIB = sbml/_sbml.o
-endif
-
 #Saeed
 # To use CUDA, pass USE_CUDA=1 in make command line
 ifeq ($(USE_CUDA),1)
@@ -371,7 +361,7 @@ export CXXFLAGS
 export LD
 export LIBS
 export USE_GSL
-export USE_SBML
+#export USE_SBML
 
 all: moose pymoose
 
diff --git a/MooseTests.cmake b/MooseTests.cmake
index 25fdf23b..77e9fc86 100644
--- a/MooseTests.cmake
+++ b/MooseTests.cmake
@@ -79,14 +79,14 @@ set_tests_properties(pymoose-ksolve-test
     PROPERTIES ENVIRONMENT "PYTHONPATH=${PROJECT_BINARY_DIR}/python"
     )
 
-# Test basic SBML support.
-ADD_TEST(NAME pymoose-test-basic-sbml-support
-    COMMAND ${TEST_COMMAND}
-    ${PROJECT_SOURCE_DIR}/tests/python/test_sbml_support.py
-    )
-set_tests_properties(pymoose-test-basic-sbml-support 
-    PROPERTIES ENVIRONMENT "PYTHONPATH=${PROJECT_BINARY_DIR}/python"
-    )
+## Test basic SBML support.
+#ADD_TEST(NAME pymoose-test-basic-sbml-support
+    #COMMAND ${TEST_COMMAND}
+    #${PROJECT_SOURCE_DIR}/tests/python/test_sbml_support.py
+    #)
+#set_tests_properties(pymoose-test-basic-sbml-support 
+    #PROPERTIES ENVIRONMENT "PYTHONPATH=${PROJECT_BINARY_DIR}/python"
+    #)
 
 # Test basic SBML support.
 ADD_TEST(NAME pymoose-test-rng
diff --git a/basecode/Makefile b/basecode/Makefile
index c4aa0eee..0639a32c 100644
--- a/basecode/Makefile
+++ b/basecode/Makefile
@@ -79,7 +79,7 @@ default: $(TARGET)
 
 $(OBJ)	: $(HEADERS) ../shell/Shell.h
 Element.o:	FuncOrder.h
-testAsync.o:	SparseMatrix.h SetGet.h ../scheduling/Clock.h ../biophysics/IntFire.h ../synapse/SynHandlerBase.h ../synapse/SimpleSynHandler.h ../synapse/Synapse.h ../randnum/RNG.h
+testAsync.o:	SparseMatrix.h SetGet.h ../scheduling/Clock.h ../biophysics/IntFire.h ../synapse/SynEvent.h ../synapse/SynHandlerBase.h ../synapse/SimpleSynHandler.h ../synapse/Synapse.h ../randnum/RNG.h
 SparseMsg.o:	SparseMatrix.h
 SetGet.o:	SetGet.h ../shell/Neutral.h
 HopFunc.o:	HopFunc.h ../mpi/PostMaster.h
diff --git a/basecode/main.cpp b/basecode/main.cpp
index a6563c09..5d234e2f 100644
--- a/basecode/main.cpp
+++ b/basecode/main.cpp
@@ -57,6 +57,7 @@ extern void testShell();
 extern void testScheduling();
 extern void testSchedulingProcess();
 extern void testBuiltins();
+extern void testSynapse();
 extern void testBuiltinsProcess();
 
 extern void testMpiScheduling();
@@ -342,6 +343,7 @@ void nonMpiTests( Shell* s )
         MOOSE_TEST("testHsolve", testHSolve());
         //MOOSE_TEST("testGeom", testGeom());
         MOOSE_TEST("testMesh", testMesh());
+        MOOSE_TEST("testSynapse", testSynapse());
         MOOSE_TEST( "testSigneur", testSigNeur());
 #ifdef USE_SMOLDYN
         //MOOSE_TEST(testSmoldyn());
diff --git a/basecode/testAsync.cpp b/basecode/testAsync.cpp
index c1624dc8..e96557f8 100644
--- a/basecode/testAsync.cpp
+++ b/basecode/testAsync.cpp
@@ -18,6 +18,7 @@
 #include <queue>
 #include "../biophysics/IntFire.h"
 #include "../synapse/Synapse.h"
+#include "../synapse/SynEvent.h"
 #include "../synapse/SynHandlerBase.h"
 #include "../synapse/SimpleSynHandler.h"
 #include "SparseMatrix.h"
diff --git a/biophysics/Leakage.cpp b/biophysics/Leakage.cpp
index d701a0f1..a3d196ba 100644
--- a/biophysics/Leakage.cpp
+++ b/biophysics/Leakage.cpp
@@ -88,14 +88,23 @@ Leakage::~Leakage()
 
 void Leakage::vProcess( const Eref & e, ProcPtr p )
 {
+	ChanCommon::vSetGk( e, this->vGetGbar( e ) * this->vGetModulation( e ));
+	updateIk();
     sendProcessMsgs(e, p);
 }
 
 void Leakage::vReinit( const Eref & e, ProcPtr p )
 {
+	ChanCommon::vSetGk( e, this->vGetGbar( e ) * this->vGetModulation( e ));
+	updateIk();
     sendReinitMsgs(e, p);
 }
 
+void Leakage::vSetGbar( const Eref& e, double gbar )
+{
+		ChanCommon::vSetGk( e, gbar * this->vGetModulation( e ) );
+		ChanCommon::vSetGbar( e, gbar );
+}
 
 // 
 // Leakage.cpp ends here
diff --git a/biophysics/Leakage.h b/biophysics/Leakage.h
index 21815a7e..82f5a990 100644
--- a/biophysics/Leakage.h
+++ b/biophysics/Leakage.h
@@ -56,6 +56,7 @@ class Leakage: public ChanCommon
     ~Leakage();
     void vProcess( const Eref & e, ProcPtr p );
     void vReinit( const Eref & e, ProcPtr p );
+    void vSetGbar( const Eref & e, double gbar );
 
     static const Cinfo * initCinfo();
 };
diff --git a/biophysics/ReadCell.cpp b/biophysics/ReadCell.cpp
index 12578943..6d87a8e7 100644
--- a/biophysics/ReadCell.cpp
+++ b/biophysics/ReadCell.cpp
@@ -823,7 +823,7 @@ bool ReadCell::addSpikeGen(
 		shell_->doAddMsg(
 			"Single",
 			compt,
-			"VmSrc",
+			"VmOut",
 			chan,
 			"Vm"
 		);
diff --git a/biophysics/SynChan.cpp b/biophysics/SynChan.cpp
index 6714ce1c..fc49a64d 100644
--- a/biophysics/SynChan.cpp
+++ b/biophysics/SynChan.cpp
@@ -199,10 +199,16 @@ void SynChan::normalizeGbar()
 /// Update alpha function terms for synaptic channel.
 double SynChan::calcGk()
 {
+		/*
 	X_ = getModulation() * activation_ * xconst1_ + X_ * xconst2_;
 	Y_ = X_ * yconst1_ + Y_ * yconst2_;
 	activation_ = 0.0;
 	return Y_ * norm_;
+	*/
+	X_ = activation_ * xconst1_ + X_ * xconst2_;
+	Y_ = X_ * yconst1_ + Y_ * yconst2_;
+	activation_ = 0.0;
+	return Y_ * norm_ * getModulation();
 }
 
 void SynChan::vProcess( const Eref& e, ProcPtr info )
diff --git a/external/muparser/include/muParser.h b/external/muparser/include/muParser.h
index efe8ae83..fb0de64c 100644
--- a/external/muparser/include/muParser.h
+++ b/external/muparser/include/muParser.h
@@ -101,6 +101,7 @@ namespace mu
     static value_type  Rint(value_type);
     static value_type  Sign(value_type);
     static value_type  Fmod(value_type, value_type);
+    static value_type  Quot(value_type, value_type);
 
     // Random between a and b, with fixed seed.
     static value_type  Rand2(value_type, value_type, value_type); 
diff --git a/external/muparser/src/muParser.cpp b/external/muparser/src/muParser.cpp
index 14977f7b..925e8bab 100644
--- a/external/muparser/src/muParser.cpp
+++ b/external/muparser/src/muParser.cpp
@@ -110,6 +110,7 @@ namespace mu
   value_type Parser::Exp(value_type v)  { return MathImpl<value_type>::Exp(v);  }
   value_type Parser::Abs(value_type v)  { return MathImpl<value_type>::Abs(v);  }
   value_type Parser::Fmod(value_type v1, value_type v2) { return fmod(v1, v2); }
+  value_type Parser::Quot(value_type v1, value_type v2) { return (int)(v1 / v2); }
 
   // If no seed is given, 
   value_type Parser::Rand( value_type seed ) 
@@ -336,6 +337,7 @@ namespace mu
       DefineFun(_T("avg"), Avg);
       DefineFun(_T("min"), Min);
       DefineFun(_T("max"), Max);
+      DefineFun(_T("quot"), Quot);
     }
   }
 
diff --git a/external/muparser/src/muParserTokenReader.cpp b/external/muparser/src/muParserTokenReader.cpp
index e237d514..e6252611 100644
--- a/external/muparser/src/muParserTokenReader.cpp
+++ b/external/muparser/src/muParserTokenReader.cpp
@@ -154,8 +154,10 @@ namespace mu
 #else
     ParserTokenReader* ptr = new ParserTokenReader(*this);
     ptr->SetParent( a_pParent );
-    delete ptr;
-    return NULL;
+	return ptr;
+	// Upi Bhalla 13 June 2016: I think the original two lines below are wrong
+    // delete ptr;
+    // return NULL;
 #endif
 
   }
diff --git a/kinetics/Enz.cpp b/kinetics/Enz.cpp
index b320e8c8..78555394 100644
--- a/kinetics/Enz.cpp
+++ b/kinetics/Enz.cpp
@@ -23,15 +23,27 @@ const Cinfo* Enz::initCinfo()
 		//////////////////////////////////////////////////////////////
 		// MsgDest Definitions
 		//////////////////////////////////////////////////////////////
+		static DestFinfo setKmK1Dest( "setKmK1",
+			"Low-level function used when you wish to explicitly set "
+			"Km and k1, without doing any of the volume calculations."
+			"Needed by ReadKkit and other situations where the numbers "
+			"must be set before all the messaging is in place."
+			"Not relevant for zombie enzymes.",
+			new OpFunc2< Enz, double, double >( &Enz::setKmK1 )
+		);
 		//////////////////////////////////////////////////////////////
 		// Shared Msg Definitions
 		//////////////////////////////////////////////////////////////
 	static Dinfo< Enz > dinfo;
+	static Finfo* enzFinfos[] = {
+		&setKmK1Dest,	// DestFinfo
+	};
+
 	static Cinfo enzCinfo (
 		"Enz",
 		CplxEnzBase::initCinfo(),
-		0,
-		0,
+		enzFinfos,
+		sizeof( enzFinfos ) / sizeof ( Finfo* ),
 		&dinfo
 	);
 
@@ -61,7 +73,7 @@ static const SrcFinfo2< double, double >* cplxOut =
 // Enz internal functions
 //////////////////////////////////////////////////////////////
 Enz::Enz( )
-	: k1_( 0.1 ), k2_( 0.4 ), k3_( 0.1 )
+	: Km_(5.0e-3), k1_( 0.1 ), k2_( 0.4 ), k3_( 0.1 )
 {
 	;
 }
@@ -73,6 +85,12 @@ Enz::~Enz()
 // MsgDest Definitions
 //////////////////////////////////////////////////////////////
 
+void Enz::setKmK1( double Km, double k1 )
+{
+	r1_ = k1_ = k1;
+	Km_ = Km;
+}
+
 void Enz::vSub( double n )
 {
 	r1_ *= n;
@@ -118,13 +136,14 @@ void Enz::vRemesh( const Eref& e )
 void Enz::vSetK1( const Eref& e, double v )
 {
 	r1_ = k1_ = v;
-	double volScale = 
-		convertConcToNumRateUsingMesh( e, subOut, 1 );
+	double volScale = convertConcToNumRateUsingMesh( e, subOut, 1 );
 	Km_ = ( k2_ + k3_ ) / ( k1_ * volScale );
 }
 
 double Enz::vGetK1( const Eref& e ) const
 {
+	Enz* temp = const_cast< Enz* >( this );
+	temp->vSetKm( e, Km_ );
 	return k1_;
 }
 
diff --git a/kinetics/Enz.h b/kinetics/Enz.h
index 9ab40988..e67aac2e 100644
--- a/kinetics/Enz.h
+++ b/kinetics/Enz.h
@@ -38,6 +38,11 @@ class Enz: public CplxEnzBase
 		void vSetConcK1( const Eref& e, double v );
 		double vGetConcK1( const Eref& e ) const;
 
+		//////////////////////////////////////////////////////////////////
+		// Dest funcs, not virtual
+		//////////////////////////////////////////////////////////////////
+		void setKmK1( double Km, double k1 );
+
 		//////////////////////////////////////////////////////////////////
 		// Dest funcs, all virtual
 		//////////////////////////////////////////////////////////////////
diff --git a/kinetics/ReadKkit.cpp b/kinetics/ReadKkit.cpp
index 00897959..1c145e98 100644
--- a/kinetics/ReadKkit.cpp
+++ b/kinetics/ReadKkit.cpp
@@ -791,7 +791,8 @@ Id findParentComptOfReac( Id reac )
 
 		vector< Id > subVec;
 		reac.element()->getNeighbors( subVec, subFinfo );
-		assert( subVec.size() > 0 );
+		if ( subVec.size() == 0 ) // Dangling reaction
+			return Id();
 		// For now just put the reac in the compt belonging to the 
 		// first substrate
 		return getCompt( subVec[0] );
@@ -808,10 +809,10 @@ void ReadKkit::assignReacCompartments()
 	for ( map< string, Id >::iterator i = reacIds_.begin(); 
 		i != reacIds_.end(); ++i ) {
 		Id compt = findParentComptOfReac( i->second );
-		// if ( moveOntoCompartment_ ) {
+		if ( compt != Id() ) {
 			if ( ! (getCompt( i->second ).id == compt ) )
 				shell_->doMove( i->second, compt );
-		// }
+		}
 	}
 }
 
@@ -893,7 +894,7 @@ Id ReadKkit::buildEnz( const vector< string >& args )
 	// double vol = atof( args[ enzMap_[ "vol" ] ].c_str());
 	bool isMM = atoi( args[ enzMap_[ "usecomplex" ] ].c_str());
 	assert( poolVols_.find( pa ) != poolVols_.end() );
-	// double vol = poolVols_[ pa ];
+	double vol = poolVols_[ pa ];
 	
 	/**
 	 * vsf is vol scale factor, which is what GENESIS stores in 'vol' field
@@ -927,7 +928,12 @@ Id ReadKkit::buildEnz( const vector< string >& args )
 		// to do this assignments in raw #/cell units.
 		Field< double >::set( enz, "k3", k3 );
 		Field< double >::set( enz, "k2", k2 );
-		Field< double >::set( enz, "k1", k1 );
+		// Here we explicitly calculate Km because the substrates are
+		// not set up until later, and without them the volume
+		// calculations would be wrong.
+		double volScale = lookupVolumeFromMesh(pa.eref());
+		double Km = (k2+k3)/(k1 * KKIT_NA * vol ); // Scaling for uM to mM.
+		SetGet2< double, double >::set( enz, "setKmK1", Km, k1 );
 
 		string cplxName = tail + "_cplx";
 		string cplxPath = enzPath + "/" + cplxName;
@@ -946,6 +952,7 @@ Id ReadKkit::buildEnz( const vector< string >& args )
 			ObjId( cplx, 0 ), "reac" ); 
 		assert( ret != ObjId() );
 
+
 		// cplx()->showFields();
 		// enz()->showFields();
 		// pa()->showFields();
@@ -1404,7 +1411,7 @@ void ReadKkit::addmsg( const vector< string >& args)
 			}
 			vector< Id > enzcplx;
 			i->second.element()->getNeighbors( enzcplx, 
-				i->second.element()->cinfo()->findFinfo( "toCplx" ) );
+				i->second.element()->cinfo()->findFinfo( "cplxOut" ) );
 			assert( enzcplx.size() == 1 );
 			pool = enzcplx[0];
 		}  else {
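
The buildEnz change above computes Km directly from the GENESIS rate constants because the substrate messages are wired up only later, when mesh-based volume lookups would not yet work. A minimal sketch of that conversion, assuming KKIT_NA stands in for the Avogadro-style scale factor defined in the C++ sources:

```python
# Sketch of the explicit Km computation done in ReadKkit::buildEnz.
# KKIT_NA is an assumed stand-in; the real constant lives in the C++ code.
KKIT_NA = 6.0e23

def kkit_km(k1, k2, k3, vol):
    # k1 is in number-based (#/cell) units; dividing by k1 * NA * vol
    # converts the Michaelis constant into concentration units.
    return (k2 + k3) / (k1 * KKIT_NA * vol)

km = kkit_km(k1=1e-4, k2=0.4, k3=0.1, vol=1e-15)
```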
diff --git a/ksolve/Gsolve.cpp b/ksolve/Gsolve.cpp
index ed8e7682..a1c576ac 100644
--- a/ksolve/Gsolve.cpp
+++ b/ksolve/Gsolve.cpp
@@ -122,6 +122,14 @@ const Cinfo* Gsolve::initCinfo()
 			&Gsolve::setClockedUpdate,
 			&Gsolve::getClockedUpdate
 		);
+		static ReadOnlyLookupValueFinfo< 
+				Gsolve, unsigned int, vector< unsigned int > > numFire(
+			"numFire",
+			"Vector of the number of times each reaction has fired. "
+			"Indexed by the voxel number. "
+			"Zeroed out at reinit.",
+			&Gsolve::getNumFire
+		);
 
 		///////////////////////////////////////////////////////
 		// DestFinfo definitions
@@ -198,6 +206,7 @@ const Cinfo* Gsolve::initCinfo()
 		// Here we put new fields that were not there in the Ksolve. 
 		&useRandInit,		// Value
 		&useClockedUpdate,	// Value
+		&numFire,			// ReadOnlyLookupValue
 	};
 	
 	static Dinfo< Gsolve > dinfo;
@@ -265,6 +274,10 @@ void Gsolve::setStoich( Id stoich )
 	assert( stoich.element()->cinfo()->isA( "Stoich" ) );
 	stoich_ = stoich;
 	stoichPtr_ = reinterpret_cast< Stoich* >( stoich.eref().data() );
+    if ( stoichPtr_->getNumAllPools() == 0 ) {
+		stoichPtr_ = 0;
+		return;
+	}
 	sys_.stoich = stoichPtr_;
 	sys_.isReady = false;
 	for ( unsigned int i = 0; i < pools_.size(); ++i )
@@ -319,6 +332,16 @@ void Gsolve::setNvec( unsigned int voxel, vector< double > nVec )
 	}
 }
 
+vector< unsigned int > Gsolve::getNumFire( unsigned int voxel ) const
+{
+	static vector< unsigned int > dummy;
+	if ( voxel < pools_.size() ) {
+		return const_cast< GssaVoxelPools* >( &( pools_[ voxel ]) )->numFire();
+	}
+	return dummy;
+}
+
+
 bool Gsolve::getRandInit() const
 {
 	return sys_.useRandInit;
@@ -467,6 +490,8 @@ void Gsolve::reinit( const Eref& e, ProcPtr p )
 //////////////////////////////////////////////////////////////
 void Gsolve::initProc( const Eref& e, ProcPtr p )
 {
+	if ( !stoichPtr_ )
+		return;
 	// vector< vector< double > > values( xfer_.size() );
 	for ( unsigned int i = 0; i < xfer_.size(); ++i ) {
 		XferInfo& xf = xfer_[i];
@@ -483,6 +508,8 @@ void Gsolve::initProc( const Eref& e, ProcPtr p )
 
 void Gsolve::initReinit( const Eref& e, ProcPtr p )
 {
+	if ( !stoichPtr_ )
+		return;
 	for ( unsigned int i = 0 ; i < pools_.size(); ++i ) {
 		pools_[i].reinit( &sys_ );
 	}
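
The new numFire field exposes, per voxel, how many times each reaction has fired since the last reinit. The bookkeeping behind it in GssaVoxelPools reduces to one counter per reaction; a pure-Python sketch of those semantics (not the moose API itself, names only mirror the C++ fields):

```python
# Minimal sketch of the numFire_ bookkeeping added to GssaVoxelPools.
class VoxelFireCounter:
    def __init__(self, num_reac):
        self.num_fire = [0] * num_reac   # numFire_ in the C++ class

    def fire(self, rindex):
        # advance() bumps the counter of the reaction that fired
        self.num_fire[rindex] += 1

    def reinit(self):
        # reinit() zeroes all counters, matching numFire_.assign(...)
        self.num_fire = [0] * len(self.num_fire)

v = VoxelFireCounter(3)
v.fire(0); v.fire(0); v.fire(2)
print(v.num_fire)   # → [2, 0, 1]
v.reinit()
print(v.num_fire)   # → [0, 0, 0]
```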
diff --git a/ksolve/Gsolve.h b/ksolve/Gsolve.h
index 686a091b..0e35eab8 100644
--- a/ksolve/Gsolve.h
+++ b/ksolve/Gsolve.h
@@ -76,6 +76,7 @@ class Gsolve: public ZombiePoolInterface
 		//////////////////////////////////////////////////////////////////
 		unsigned int getPoolIndex( const Eref& e ) const;
 		unsigned int getVoxelIndex( const Eref& e ) const;
+		vector< unsigned int > getNumFire( unsigned int voxel ) const;
 
 		/**
 		 * Inherited. Needed for reac-diff calculations so the Gsolve can
diff --git a/ksolve/GssaVoxelPools.cpp b/ksolve/GssaVoxelPools.cpp
index c7eaba66..c4b529ee 100644
--- a/ksolve/GssaVoxelPools.cpp
+++ b/ksolve/GssaVoxelPools.cpp
@@ -125,6 +125,7 @@ void GssaVoxelPools::setNumReac( unsigned int n )
 {
     v_.clear();
     v_.resize( n, 0.0 );
+    numFire_.resize( n, 0 );
 }
 
 /**
@@ -201,6 +202,8 @@ void GssaVoxelPools::advance( const ProcInfo* p, const GssaSystem* g )
 
         double sign = double(v_[rindex] >= 0) - double(0 > v_[rindex] );
         g->transposeN.fireReac( rindex, Svec(), sign );
+		numFire_[rindex]++;
+		
         double r = rng_.uniform();
         while ( r <= 0.0 )
         {
@@ -246,6 +249,12 @@ void GssaVoxelPools::reinit( const GssaSystem* g )
     }
     t_ = 0.0;
     refreshAtot( g );
+	numFire_.assign( v_.size(), 0 );
+}
+
+vector< unsigned int > GssaVoxelPools::numFire() const
+{
+	return numFire_;
 }
 
 /////////////////////////////////////////////////////////////////////////
diff --git a/ksolve/GssaVoxelPools.h b/ksolve/GssaVoxelPools.h
index ec062b46..fc638d45 100644
--- a/ksolve/GssaVoxelPools.h
+++ b/ksolve/GssaVoxelPools.h
@@ -34,6 +34,8 @@ public:
 
     void advance( const ProcInfo* p, const GssaSystem* g );
 
+	vector< unsigned int > numFire() const;
+
     /**
     * Cleans out all reac rates and recalculates atot. Needed whenever a
     * mol conc changes, or if there is a roundoff error. Returns true
@@ -96,6 +98,9 @@ private:
      */
     vector< double > v_;
     // Possibly we should put independent RNGS, so save one here.
+	
+	// Count how many times each reaction has fired.
+	vector< unsigned int > numFire_;
 
     /**
      * @brief RNG.
diff --git a/ksolve/Ksolve.cpp b/ksolve/Ksolve.cpp
index 508f2c85..56a85c4a 100644
--- a/ksolve/Ksolve.cpp
+++ b/ksolve/Ksolve.cpp
@@ -238,8 +238,8 @@ Ksolve::Ksolve()
 #elif USE_BOOST
     method_( "rk5a" ),
 #endif
-    epsAbs_( 1e-4 ),
-    epsRel_( 1e-6 ),
+    epsAbs_( 1e-7 ),
+    epsRel_( 1e-7 ),
     pools_( 1 ),
     startVoxel_( 0 ),
     dsolve_(),
@@ -368,8 +368,10 @@ void Ksolve::setStoich( Id stoich )
         ode.method = method_;
 #ifdef USE_GSL
         ode.gslSys.dimension = stoichPtr_->getNumAllPools();
-        if ( ode.gslSys.dimension == 0 )
+        if ( ode.gslSys.dimension == 0 ) {
+			stoichPtr_ = 0;
             return; // No pools, so don't bother.
+		}
         innerSetMethod( ode, method_ );
         ode.gslSys.function = &VoxelPools::gslFunc;
         ode.gslSys.jacobian = 0;
@@ -565,7 +567,8 @@ void Ksolve::process( const Eref& e, ProcPtr p )
 
 void Ksolve::reinit( const Eref& e, ProcPtr p )
 {
-    assert( stoichPtr_ );
+    if ( !stoichPtr_ )
+		return;
     if ( isBuilt_ )
     {
         for ( unsigned int i = 0 ; i < pools_.size(); ++i )
diff --git a/ksolve/OdeSystem.h b/ksolve/OdeSystem.h
index b629a82f..9a18191c 100644
--- a/ksolve/OdeSystem.h
+++ b/ksolve/OdeSystem.h
@@ -20,7 +20,7 @@ class OdeSystem {
     public:
         OdeSystem()
             : method( "rk5" ),
-            initStepSize( 1 ),
+            initStepSize( 0.001 ),
             epsAbs( 1e-6 ),
             epsRel( 1e-6 )
     {;}
diff --git a/ksolve/Stoich.cpp b/ksolve/Stoich.cpp
index e3788968..db6e2d77 100644
--- a/ksolve/Stoich.cpp
+++ b/ksolve/Stoich.cpp
@@ -1240,14 +1240,17 @@ const KinSparseMatrix& Stoich::getStoichiometryMatrix() const
 
 void Stoich::buildXreacs( const Eref& e, Id otherStoich )
 {
-    kinterface_->setupCrossSolverReacs( offSolverPoolMap_, otherStoich );
+	if ( status_ == 0 )
+    	kinterface_->setupCrossSolverReacs( offSolverPoolMap_,otherStoich);
 }
 
 void Stoich::filterXreacs()
 {
-    kinterface_->filterCrossRateTerms( offSolverReacVec_, offSolverReacCompts_ );
-    kinterface_->filterCrossRateTerms( offSolverEnzVec_, offSolverEnzCompts_ );
-    kinterface_->filterCrossRateTerms( offSolverMMenzVec_, offSolverMMenzCompts_ );
+	if ( status_ == 0 ) {
+    	kinterface_->filterCrossRateTerms( offSolverReacVec_, offSolverReacCompts_ );
+    	kinterface_->filterCrossRateTerms( offSolverEnzVec_, offSolverEnzCompts_ );
+    	kinterface_->filterCrossRateTerms( offSolverMMenzVec_, offSolverMMenzCompts_ );
+	}
 }
 
 /*
@@ -1491,32 +1494,36 @@ void Stoich::unZombifyModel()
 
     unZombifyPools();
 
-    for ( vector< Id >::iterator i = reacVec_.begin();
-            i != reacVec_.end(); ++i )
+	vector< Id > temp = reacVec_; temp.insert( temp.end(), 
+					offSolverReacVec_.begin(), offSolverReacVec_.end() );
+    for ( vector< Id >::iterator i = temp.begin(); i != temp.end(); ++i )
     {
         Element* e = i->element();
         if ( e != 0 &&  e->cinfo() == zombieReacCinfo )
             ReacBase::zombify( e, reacCinfo, Id() );
     }
 
-    for ( vector< Id >::iterator i = mmEnzVec_.begin();
-            i != mmEnzVec_.end(); ++i )
+	temp = mmEnzVec_; temp.insert( temp.end(), 
+					offSolverMMenzVec_.begin(), offSolverMMenzVec_.end() );
+    for ( vector< Id >::iterator i = temp.begin(); i != temp.end(); ++i )
     {
         Element* e = i->element();
         if ( e != 0 &&  e->cinfo() == zombieMMenzCinfo )
             EnzBase::zombify( e, mmEnzCinfo, Id() );
     }
 
-    for ( vector< Id >::iterator i = enzVec_.begin();
-            i != enzVec_.end(); ++i )
+	temp = enzVec_; temp.insert( temp.end(), 
+					offSolverEnzVec_.begin(), offSolverEnzVec_.end() );
+    for ( vector< Id >::iterator i = temp.begin(); i != temp.end(); ++i )
     {
         Element* e = i->element();
         if ( e != 0 &&  e->cinfo() == zombieEnzCinfo )
             CplxEnzBase::zombify( e, enzCinfo, Id() );
     }
 
-    for ( vector< Id >::iterator i = poolFuncVec_.begin();
-            i != poolFuncVec_.end(); ++i )
+	temp = poolFuncVec_; temp.insert( temp.end(), 
+		incrementFuncVec_.begin(), incrementFuncVec_.end() );
+    for ( vector< Id >::iterator i = temp.begin(); i != temp.end(); ++i )
     {
         Element* e = i->element();
         if ( e != 0 && e->cinfo() == zombieFunctionCinfo )
@@ -2282,7 +2289,7 @@ unsigned int Stoich::indexOfMatchingVolume( double vol ) const
 
 void Stoich::scaleBufsAndRates( unsigned int index, double volScale )
 {
-    if ( !kinterface_ )
+    if ( !kinterface_ || status_ != 0 )
         return;
     kinterface_->pools( index )->scaleVolsBufsRates( volScale, this );
 }
diff --git a/ksolve/VoxelPools.cpp b/ksolve/VoxelPools.cpp
index d1bd615f..3c933b0a 100644
--- a/ksolve/VoxelPools.cpp
+++ b/ksolve/VoxelPools.cpp
@@ -60,7 +60,7 @@ void VoxelPools::reinit( double dt )
 	if ( !driver_ )
 		return;
 	gsl_odeiv2_driver_reset( driver_ );
-	gsl_odeiv2_driver_reset_hstart( driver_, dt );
+	gsl_odeiv2_driver_reset_hstart( driver_, dt / 10.0 );
 #endif
 }
 
diff --git a/ksolve/ZombieEnz.cpp b/ksolve/ZombieEnz.cpp
index e8ded896..65c17a36 100644
--- a/ksolve/ZombieEnz.cpp
+++ b/ksolve/ZombieEnz.cpp
@@ -115,7 +115,17 @@ double ZombieEnz::vGetK2( const Eref& e ) const
 
 void ZombieEnz::vSetKcat( const Eref& e, double v )
 {
+	double k2 = getK2( e );
+	double k3 = getKcat( e );
+	double ratio = 4.0;
+	if ( k3 > 1e-10 )
+		ratio = k2/k3;
+	double Km = (k2 + k3) / concK1_;
+	concK1_ = v * (1.0 + ratio) / Km;
+
+	stoich_->setEnzK1( e, concK1_ );
 	stoich_->setEnzK3( e, v );
+	stoich_->setEnzK2( e, v * ratio );
 }
 
 double ZombieEnz::vGetKcat( const Eref& e ) const
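
The reworked vSetKcat above rescales k1 and k2 along with k3 so that the k2/k3 ratio, and therefore Km, stays fixed when kcat changes. A quick numeric check mirroring the arithmetic in the diff (pure Python; variable names follow the C++):

```python
def set_kcat(conc_k1, k2, k3, v):
    # Mirrors ZombieEnz::vSetKcat: returns the new (concK1, k2, k3).
    ratio = k2 / k3 if k3 > 1e-10 else 4.0
    km = (k2 + k3) / conc_k1
    new_conc_k1 = v * (1.0 + ratio) / km
    return new_conc_k1, v * ratio, v

conc_k1, k2, k3 = 2.0, 0.4, 0.1
km_before = (k2 + k3) / conc_k1
nk1, nk2, nk3 = set_kcat(conc_k1, k2, k3, v=0.5)
km_after = (nk2 + nk3) / nk1
# km_before == km_after: kcat changes, Km is preserved.
```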
diff --git a/ksolve/ZombieFunction.cpp b/ksolve/ZombieFunction.cpp
index 7760c4f1..d430573f 100644
--- a/ksolve/ZombieFunction.cpp
+++ b/ksolve/ZombieFunction.cpp
@@ -114,7 +114,9 @@ ZombieFunction::~ZombieFunction()
 // MsgDest Definitions
 //////////////////////////////////////////////////////////////
 void ZombieFunction::process(const Eref &e, ProcPtr p)
-{;}
+{
+	_t = p->currTime;
+}
 
 void ZombieFunction::reinit(const Eref &e, ProcPtr p)
 {;}
diff --git a/pymoose/moosemodule.cpp b/pymoose/moosemodule.cpp
index b0653fc0..97f66dad 100644
--- a/pymoose/moosemodule.cpp
+++ b/pymoose/moosemodule.cpp
@@ -1792,6 +1792,7 @@ PyObject * moose_exists(PyObject * dummy, PyObject * args)
 }
 
 //Harsha : For writing genesis file to sbml
+/*
 PyObject * moose_writeSBML(PyObject * dummy, PyObject * args)
 {
     char * fname = NULL, * modelpath = NULL;
@@ -1802,7 +1803,8 @@ PyObject * moose_writeSBML(PyObject * dummy, PyObject * args)
     int ret = SHELLPTR->doWriteSBML(string(modelpath),string(fname));
     return Py_BuildValue("i", ret);
 }
-
+*/
+/*
 PyObject * moose_readSBML(PyObject * dummy, PyObject * args)
 {
     char * fname = NULL, * modelpath = NULL, * solverclass = NULL;
@@ -1832,7 +1834,7 @@ PyObject * moose_readSBML(PyObject * dummy, PyObject * args)
     PyObject * ret = reinterpret_cast<PyObject*>(model);
     return ret;
 }
-
+*/
 PyDoc_STRVAR(moose_loadModel_documentation,
              "loadModel(filename, modelpath, solverclass) -> vec\n"
              "\n"
@@ -3073,8 +3075,8 @@ static PyMethodDef MooseMethods[] =
     {"stop", (PyCFunction)moose_stop, METH_VARARGS, "Stop simulation"},
     {"isRunning", (PyCFunction)moose_isRunning, METH_VARARGS, "True if the simulation is currently running."},
     {"exists", (PyCFunction)moose_exists, METH_VARARGS, "True if there is an object with specified path."},
-    {"writeSBML", (PyCFunction)moose_writeSBML, METH_VARARGS, "Export biochemical model to an SBML file."},
-    {"readSBML",  (PyCFunction)moose_readSBML,  METH_VARARGS, "Import SBML model to Moose."},
+    //{"writeSBML", (PyCFunction)moose_writeSBML, METH_VARARGS, "Export biochemical model to an SBML file."},
+    //{"readSBML",  (PyCFunction)moose_readSBML,  METH_VARARGS, "Import SBML model to Moose."},
     {"loadModel", (PyCFunction)moose_loadModel, METH_VARARGS, moose_loadModel_documentation},
     {"saveModel", (PyCFunction)moose_saveModel, METH_VARARGS, moose_saveModel_documentation},
     {"connect", (PyCFunction)moose_connect, METH_VARARGS, moose_connect_documentation},
diff --git a/python/moose/SBML/__init__.py b/python/moose/SBML/__init__.py
index a3c3042f..a3b90826 100755
--- a/python/moose/SBML/__init__.py
+++ b/python/moose/SBML/__init__.py
@@ -1,4 +1,4 @@
-from  writeSBML import mooseWriteSBML
-from  readSBML import mooseReadSBML
+from  .writeSBML import mooseWriteSBML
+from  .readSBML import mooseReadSBML
 
 __all__ = ["mooseWriteSBML","mooseReadSBML"]
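
The one-line __init__.py change is part of a broader Python 2 → 3 migration visible throughout this patch: explicit relative imports, `in` instead of `has_key`, `list(d.items())`, and print as a function. A small standalone sketch of those idioms (not the SBML module itself):

```python
d = {"enzyme": "E1"}

# d.has_key("enzyme") was removed in Python 3; membership works in both:
assert "enzyme" in d

# .items() returns a view in Python 3; wrap it in list() when the dict
# may change during iteration (as the reworked addSubPrd does):
pairs = list(d.items())

# print is a function in Python 3, hence the added parentheses in the diff:
print("enzyme:", d["enzyme"])
```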
diff --git a/python/moose/SBML/readSBML.py b/python/moose/SBML/readSBML.py
index 08055a60..2deda273 100644
--- a/python/moose/SBML/readSBML.py
+++ b/python/moose/SBML/readSBML.py
@@ -12,7 +12,7 @@
 **           copyright (C) 2003-2016 Upinder S. Bhalla. and NCBS
 Created : Thu May 12 10:19:00 2016(+0530)
 Version 
-Last-Updated:
+Last-Updated: Wed Sep 28
 		  By:
 **********************************************************************/
 /****************************
@@ -23,7 +23,7 @@ import sys
 import os.path
 import collections
 from moose import *
-import libsbml
+#import libsbml
 
 '''
    TODO in
@@ -49,77 +49,90 @@ import libsbml
         	 ----when stoichiometry is rational number 22
 	 	---- For Michaelis Menten kinetics km is not defined which is most of the case need to calculate
 '''
-
-def mooseReadSBML(filepath,loadpath):
-	print " filepath ",filepath
-	try:
-		filep = open(filepath, "r")
-		document = libsbml.readSBML(filepath)
-		num_errors = document.getNumErrors()
-		if ( num_errors > 0 ):
-			print("Encountered the following SBML errors:" );
-			document.printErrors();
-			return moose.element('/');
-		else:
-			level = document.getLevel();
-			version = document.getVersion();
-			print("\n" + "File: " + filepath + " (Level " + str(level) + ", version " + str(version) + ")" );
-			model = document.getModel();
-			if (model == None):
-				print("No model present." );
+try: 
+    #from libsbml import *
+    import libsbml
+except ImportError: 
+    def mooseReadSBML(filepath,loadpath,solver="ee"):
+        return (-2,"\n ReadSBML : python-libsbml module not installed",None)
+else:
+	def mooseReadSBML(filepath,loadpath,solver="ee"):
+		try:
+			filep = open(filepath, "r")
+			document = libsbml.readSBML(filepath)
+			num_errors = document.getNumErrors()
+			if ( num_errors > 0 ):
+				print("Encountered the following SBML errors:" );
+				document.printErrors();
 				return moose.element('/');
 			else:
-				print " model ",model
-				print("functionDefinitions: " + str(model.getNumFunctionDefinitions()) );
-				print("    unitDefinitions: " + str(model.getNumUnitDefinitions()) );
-				print("   compartmentTypes: " + str(model.getNumCompartmentTypes()) );
-				print("        specieTypes: " + str(model.getNumSpeciesTypes()) );
-				print("       compartments: " + str(model.getNumCompartments()) );
-				print("            species: " + str(model.getNumSpecies()) );
-				print("         parameters: " + str(model.getNumParameters()) );
-				print(" initialAssignments: " + str(model.getNumInitialAssignments()) );
-				print("              rules: " + str(model.getNumRules()) );
-				print("        constraints: " + str(model.getNumConstraints()) );
-				print("          reactions: " + str(model.getNumReactions()) );
-				print("             events: " + str(model.getNumEvents()) );
-				print("\n");
-
-				if (model.getNumCompartments() == 0):
-					return moose.element('/')
+				level = document.getLevel();
+				version = document.getVersion();
+				print(("\n" + "File: " + filepath + " (Level " + str(level) + ", version " + str(version) + ")" ));
+				model = document.getModel();
+				if (model == None):
+					print("No model present." );
+					return moose.element('/');
 				else:
-					baseId = moose.Neutral(loadpath)
-					#All the model will be created under model as a thumbrule
-					basePath = moose.Neutral(baseId.path+'/model')
-					#Map Compartment's SBML id as key and value is list of[ Moose ID and SpatialDimensions ]
-					comptSbmlidMooseIdMap = {}
-					print ": ",basePath.path
-					globparameterIdValue = {}
-					modelAnnotaInfo = {}
-					mapParameter(model,globparameterIdValue)
-					errorFlag = createCompartment(basePath,model,comptSbmlidMooseIdMap)
-					if errorFlag:
-						specInfoMap = {}
-						errorFlag = createSpecies(basePath,model,comptSbmlidMooseIdMap,specInfoMap)
+					print((" model: " +str(model)));
+					print(("functionDefinitions: " + str(model.getNumFunctionDefinitions()) ));
+					print(("    unitDefinitions: " + str(model.getNumUnitDefinitions()) ));
+					print(("   compartmentTypes: " + str(model.getNumCompartmentTypes()) ));
+					print(("        specieTypes: " + str(model.getNumSpeciesTypes()) ));
+					print(("       compartments: " + str(model.getNumCompartments()) ));
+					print(("            species: " + str(model.getNumSpecies()) ));
+					print(("         parameters: " + str(model.getNumParameters()) ));
+					print((" initialAssignments: " + str(model.getNumInitialAssignments()) ));
+					print(("              rules: " + str(model.getNumRules()) ));
+					print(("        constraints: " + str(model.getNumConstraints()) ));
+					print(("          reactions: " + str(model.getNumReactions()) ));
+					print(("             events: " + str(model.getNumEvents()) ));
+					print("\n");
+
+					if (model.getNumCompartments() == 0):
+						return moose.element('/')
+					else:
+						baseId = moose.Neutral(loadpath)
+						#As a rule of thumb, the entire model is created under 'model'
+						basePath = moose.Neutral(baseId.path+'/model')
+						#Map Compartment's SBML id as key and value is list of[ Moose ID and SpatialDimensions ]
+						global comptSbmlidMooseIdMap
+						global warning
+						warning = " "
+						comptSbmlidMooseIdMap = {}
+						print(("modelPath:" + basePath.path))
+						globparameterIdValue = {}
+						modelAnnotaInfo = {}
+						mapParameter(model,globparameterIdValue)
+						errorFlag = createCompartment(basePath,model,comptSbmlidMooseIdMap)
 						if errorFlag:
-							errorFlag = createRules(model,specInfoMap,globparameterIdValue)
+							specInfoMap = {}
+							errorFlag = createSpecies(basePath,model,comptSbmlidMooseIdMap,specInfoMap,modelAnnotaInfo)
 							if errorFlag:
-								errorFlag = createReaction(model,specInfoMap,modelAnnotaInfo)
-					if not errorFlag:
-						print " errorFlag ",errorFlag
-						#Any time in the middle if SBML does not read then I delete everything from model level
-						#This is important as while reading in GUI the model will show up untill built which is not correct
-						print "Deleted rest of the model"
-						moose.delete(basePath)
-				return baseId;
-
+								errorFlag = createRules(model,specInfoMap,globparameterIdValue)
+								if errorFlag:
+									errorFlag,msg = createReaction(model,specInfoMap,modelAnnotaInfo,globparameterIdValue)
+							getModelAnnotation(model,baseId,basePath)
+						
+						if not errorFlag:
+							print(msg)
+							#If SBML reading fails midway, delete everything from the model level.
+							#Otherwise a partially built model would show up in the GUI, which is misleading.
+							#print "Deleted rest of the model"
+							moose.delete(basePath)
+					return baseId;
+
+
+		except IOError:
+			print("File ", filepath, " does not exist.")
+			return moose.element('/')
 
-	except IOError:
-		print "File " ,filepath ," does not exist."
-		return moose.element('/')
 def setupEnzymaticReaction(enz,groupName,enzName,specInfoMap,modelAnnotaInfo):
 	enzPool = (modelAnnotaInfo[groupName]["enzyme"])
+	enzPool = str(idBeginWith(enzPool))
 	enzParent = specInfoMap[enzPool]["Mpath"]
 	cplx = (modelAnnotaInfo[groupName]["complex"])
+	cplx = str(idBeginWith(cplx))
 	complx = moose.element(specInfoMap[cplx]["Mpath"].path)
 	
 	enzyme_ = moose.Enz(enzParent.path+'/'+enzName)
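
mooseReadSBML is now defined inside a try/except ImportError so that a missing python-libsbml yields a clear error tuple instead of an import-time crash. A standalone sketch of the pattern; the module name below is deliberately bogus so the fallback branch runs here:

```python
try:
    import libsbml_stand_in_that_does_not_exist  # stands in for libsbml
except ImportError:
    def mooseReadSBML(filepath, loadpath, solver="ee"):
        # Stub: report the missing optional dependency instead of crashing.
        return (-2, "\n ReadSBML : python-libsbml module not installed", None)
else:
    def mooseReadSBML(filepath, loadpath, solver="ee"):
        raise NotImplementedError  # real libsbml-based parser goes here

code, msg, model = mooseReadSBML("model.xml", "/model")
# code → -2 when the dependency is absent
```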
@@ -132,20 +145,23 @@ def setupEnzymaticReaction(enz,groupName,enzName,specInfoMap,modelAnnotaInfo):
 
 	for si in range(0,len(sublist)):
 		sl = sublist[si]
+		sl = str(idBeginWith(sl))
 		mSId =specInfoMap[sl]["Mpath"]
 		moose.connect(enzyme_,"sub",mSId,"reac")
 
 	for pi in range(0,len(prdlist)):
 		pl = prdlist[pi]
+		pl = str(idBeginWith(pl))
 		mPId = specInfoMap[pl]["Mpath"]
 		moose.connect(enzyme_,"prd",mPId,"reac")
 	
 	if (enz.isSetNotes):
 		pullnotes(enz,enzyme_)
 
+	return enzyme_,True
+
 def addSubPrd(reac,reName,type,reactSBMLIdMooseId,specInfoMap):
 	rctMapIter = {}
-
 	if (type == "sub"):
 		noplusStoichsub = 0
 		addSubinfo = collections.OrderedDict()
@@ -153,8 +169,12 @@ def addSubPrd(reac,reName,type,reactSBMLIdMooseId,specInfoMap):
 			rct = reac.getReactant(rt)
 			sp = rct.getSpecies()
 			rctMapIter[sp] = rct.getStoichiometry()
+			if rct.getStoichiometry() > 1:
+				pass
+				#print " stoich ",reac.name,rct.getStoichiometry()
 			noplusStoichsub = noplusStoichsub+rct.getStoichiometry()
-		for key,value in rctMapIter.items():
+		for key,value in list(rctMapIter.items()):
+			key = str(idBeginWith(key))
 			src = specInfoMap[key]["Mpath"]
 			des = reactSBMLIdMooseId[reName]["MooseId"]
 			for s in range(0,int(value)):
@@ -169,11 +189,15 @@ def addSubPrd(reac,reName,type,reactSBMLIdMooseId,specInfoMap):
 			rct = reac.getProduct(rt)
 			sp = rct.getSpecies()
 			rctMapIter[sp] = rct.getStoichiometry()
+			if rct.getStoichiometry() > 1:
+				pass
+				#print " stoich prd",reac.name,rct.getStoichiometry()
 			noplusStoichprd = noplusStoichprd+rct.getStoichiometry()
 		
-		for key,values in rctMapIter.items():
+		for key,values in list(rctMapIter.items()):
 			#src ReacBase
 			src = reactSBMLIdMooseId[reName]["MooseId"]
+			key = parentSp = str(idBeginWith(key))
 			des = specInfoMap[key]["Mpath"]
 			for i in range(0,int(values)):
 				moose.connect(src, 'prd', des, 'reac', 'OneToOne')
@@ -181,13 +205,88 @@ def addSubPrd(reac,reName,type,reactSBMLIdMooseId,specInfoMap):
 		reactSBMLIdMooseId[reName].update(addPrdinfo)
 
 def populatedict(annoDict,label,value):
-	if annoDict.has_key(label):
+	if label in annoDict:
 		annoDict.setdefault(label,[])
 		annoDict[label].update({value})
 	else:
 		annoDict[label]= {value}
 
-def getModelAnnotation(obj,modelAnnotaInfo):
+def getModelAnnotation(obj,baseId,basepath):
+	annotationNode = obj.getAnnotation()
+	if annotationNode != None:
+		numchild = annotationNode.getNumChildren()
+		for child_no in range(0,numchild):
+			childNode = annotationNode.getChild( child_no )
+			if ( childNode.getPrefix() == "moose" and childNode.getName() == "ModelAnnotation" ):
+				num_gchildren = childNode.getNumChildren()
+				for gchild_no in range(0,num_gchildren):
+					grandChildNode = childNode.getChild(gchild_no)
+					nodeName = grandChildNode.getName()
+					if (grandChildNode.getNumChildren() == 1 ):
+						baseinfo = moose.Annotator(baseId.path+'/info')
+						baseinfo.modeltype = "xml"
+						if nodeName == "runTime":
+							runtime = float((grandChildNode.getChild(0).toXMLString()))
+							baseinfo.runtime = runtime
+						if nodeName == "solver":
+							solver = (grandChildNode.getChild(0).toXMLString())
+							baseinfo.solver = solver
+						if(nodeName == "plots"):
+							plotValue = (grandChildNode.getChild(0).toXMLString())
+							p = moose.element(baseId)
+							datapath = moose.element(baseId).path +"/data"
+							if not moose.exists(datapath):
+								datapath = moose.Neutral(baseId.path+"/data")
+								graph = moose.Neutral(datapath.path+"/graph_0")
+								plotlist= plotValue.split(";")
+								tablelistname = []
+								for plots in plotlist:
+									plots = plots.replace(" ", "")
+									plotorg = plots
+									if moose.exists(basepath.path+plotorg):
+										plotSId = moose.element(basepath.path+plotorg)
+										#plotorg = convertSpecialChar(plotorg)
+										plot2 = plots.replace('/','_')
+										plot3 = plot2.replace('[','_')
+										plotClean = plot3.replace(']','_')
+										plotName =  plotClean + ".conc"
+										fullPath = graph.path+'/'+plotName.replace(" ","")
+										#If table exist with same name then its not created
+										if not fullPath  in tablelistname:
+											tab = moose.Table2(fullPath)
+											tablelistname.append(fullPath)
+											moose.connect(tab,"requestOut",plotSId,"getConc")
+
+def getObjAnnotation(obj,modelAnnotationInfo):
+	name = obj.getId()
+	name = name.replace(" ","_space_")
+	#modelAnnotaInfo= {}
+	annotateMap = {}
+	if (obj.getAnnotation() != None):
+		annoNode = obj.getAnnotation()
+		for ch in range(0,annoNode.getNumChildren()):
+			childNode = annoNode.getChild(ch)
+			if (childNode.getPrefix() == "moose" and (childNode.getName() == "ModelAnnotation" or childNode.getName() == "EnzymaticReaction")):
+				sublist = []
+				for gch in range(0,childNode.getNumChildren()):
+					grandChildNode = childNode.getChild(gch)
+					nodeName = grandChildNode.getName()
+					nodeValue = ""
+					if (grandChildNode.getNumChildren() == 1):
+						nodeValue = grandChildNode.getChild(0).toXMLString()
+					else:
+						print("Error: expected exactly ONE child of ", nodeName)
+					
+					if nodeName == "xCord":
+						annotateMap[nodeName] = nodeValue
+					if nodeName == "yCord":
+						annotateMap[nodeName] = nodeValue
+					if nodeName == "bgColor":
+						annotateMap[nodeName] = nodeValue
+					if nodeName == "textColor":
+						annotateMap[nodeName] = nodeValue
+	return annotateMap
+def getEnzAnnotation(obj,modelAnnotaInfo,rev,globparameterIdValue,specInfoMap):
 	name = obj.getId()
 	name = name.replace(" ","_space_")
 	#modelAnnotaInfo= {}
@@ -205,7 +304,7 @@ def getModelAnnotation(obj,modelAnnotaInfo):
 					if (grandChildNode.getNumChildren() == 1):
 						nodeValue = grandChildNode.getChild(0).toXMLString()
 					else:
-						print "Error: expected exactly ONE child of ", nodeName
+						print("Error: expected exactly ONE child of ", nodeName)
 					
 					if nodeName == "enzyme":
 						populatedict(annotateMap,"enzyme",nodeValue)
@@ -227,45 +326,64 @@ def getModelAnnotation(obj,modelAnnotaInfo):
 					elif ( nodeName == "yCord" ):
 						populatedict(annotateMap,"yCord" ,nodeValue)
 	groupName = ""
-	if annotateMap.has_key('grpName'):
+	if 'grpName' in annotateMap:
 		groupName = list(annotateMap["grpName"])[0]
+		klaw=obj.getKineticLaw();
+		mmsg = ""
+		errorFlag, mmsg,k1,k2 = getKLaw(obj,klaw,rev,globparameterIdValue,specInfoMap)
+
+		if 'substrates' in annotateMap:
+			sublist = list(annotateMap["substrates"])
+		else:
+			sublist = []
+		if 'product' in annotateMap:
+			prdlist = list(annotateMap["product"])
+		else:
+			prdlist = []
+
 		if list(annotateMap["stage"])[0] == '1':
-			if modelAnnotaInfo.has_key(groupName):
+			if groupName in modelAnnotaInfo:
 				modelAnnotaInfo[groupName].update	(
 					{"enzyme" : list(annotateMap["enzyme"])[0],
 					"stage" : list(annotateMap["stage"])[0],
-					"substrate" : list(annotateMap["substrates"])
+					"substrate" : sublist,
+					"k1": k1,
+					"k2" : k2
 					}
 				)
 			else:
 				modelAnnotaInfo[groupName]= {
 					"enzyme" : list(annotateMap["enzyme"])[0],
 					"stage" : list(annotateMap["stage"])[0],
-					"substrate" : list(annotateMap["substrates"])
+					"substrate" :  sublist,
+					"k1" : k1,
+					"k2" : k2
 					#"group" : list(annotateMap["Group"])[0],
 					#"xCord" : list(annotateMap["xCord"])[0],
 					#"yCord" : list(annotateMap["yCord"]) [0]
 					}
 
 		elif list(annotateMap["stage"])[0] == '2':
-			if modelAnnotaInfo.has_key(groupName):
+			if groupName in modelAnnotaInfo:
 				stage = int(modelAnnotaInfo[groupName]["stage"])+int(list(annotateMap["stage"])[0])
 				modelAnnotaInfo[groupName].update (
 					{"complex" : list(annotateMap["complex"])[0],
-					"product" : list(annotateMap["product"]),
-					"stage" : [stage]
+					"product" : prdlist,
+					"stage" : [stage],
+					"k3" : k1
 					}
 				)
 			else:
 				modelAnnotaInfo[groupName]= {
 					"complex" : list(annotateMap["complex"])[0],
-					"product" : list(annotateMap["product"]),
-					"stage" : [stage]
+					"product" : prdlist,
+					"stage" : [stage],
+					"k3" : k1
 					}
 	return(groupName)
 
 
-def createReaction(model,specInfoMap,modelAnnotaInfo):
+def createReaction(model,specInfoMap,modelAnnotaInfo,globparameterIdValue):
 	# print " reaction "
 	# Things done for reaction
 	# --Reaction is not created, if substrate and product is missing
@@ -277,7 +395,10 @@ def createReaction(model,specInfoMap,modelAnnotaInfo):
 
 	errorFlag = True
 	reactSBMLIdMooseId = {}
-
+	msg = ""
+	rName = ""
+	reaction_ = ""
+	
 	for ritem in range(0,model.getNumReactions()):
 		reactionCreated = False
 		groupName = ""
@@ -292,12 +413,35 @@ def createReaction(model,specInfoMap,modelAnnotaInfo):
 		rev = reac.getReversible()
 		fast = reac.getFast()
 		if ( fast ):
-			print " warning: for now fast attribute is not handled \"", rName,"\""
+			print(" warning: the 'fast' attribute is not handled yet \"", rName,"\"")
 		if (reac.getAnnotation() != None):
-			groupName = getModelAnnotation(reac,modelAnnotaInfo)
+			groupName = getEnzAnnotation(reac,modelAnnotaInfo,rev,globparameterIdValue,specInfoMap)
 			
 		if (groupName != "" and list(modelAnnotaInfo[groupName]["stage"])[0] == 3):
-			setupEnzymaticReaction(reac,groupName,rName,specInfoMap,modelAnnotaInfo)
+			reaction_,reactionCreated = setupEnzymaticReaction(reac,groupName,rName,specInfoMap,modelAnnotaInfo)
+			reaction_.k3 = modelAnnotaInfo[groupName]['k3']
+			reaction_.k2 = modelAnnotaInfo[groupName]['k2']
+			reaction_.concK1 = modelAnnotaInfo[groupName]['k1']
+			if reactionCreated:
+				if (reac.isSetNotes):
+					pullnotes(reac,reaction_)
+					reacAnnoInfo = {}
+				reacAnnoInfo = getObjAnnotation(reac,modelAnnotaInfo)
+				if reacAnnoInfo:
+					if not moose.exists(reaction_.path+'/info'):
+						reacInfo = moose.Annotator(reaction_.path+'/info')
+					else:
+						reacInfo = moose.element(reaction_.path+'/info')
+					for k,v in list(reacAnnoInfo.items()):
+						if k == 'xCord':
+							reacInfo.x = float(v)
+						elif k == 'yCord':
+							reacInfo.y = float(v)
+						elif k == 'bgColor':
+							reacInfo.color = v
+						else:
+							reacInfo.textColor = v
+
 
 		elif(groupName == ""):
 			numRcts = reac.getNumReactants()
@@ -305,18 +449,22 @@ def createReaction(model,specInfoMap,modelAnnotaInfo):
 			nummodifiers = reac.getNumModifiers()
 			
 			if not (numRcts and numPdts):
-				print rName," : Substrate and Product is missing, we will be skiping creating this reaction in MOOSE"
-			
+				print(rName," : Substrate and Product are missing; skipping creation of this reaction in MOOSE")
+				reactionCreated = False
 			elif (reac.getNumModifiers() > 0):
-				reactionCreated = setupMMEnzymeReaction(reac,rName,specInfoMap,reactSBMLIdMooseId)
-				print " reactionCreated after enz ",reactionCreated
-
+				reactionCreated,reaction_ = setupMMEnzymeReaction(reac,rName,specInfoMap,reactSBMLIdMooseId,modelAnnotaInfo,model,globparameterIdValue)
 			elif (numRcts):
 				# In moose, reactions compartment are decided from first Substrate compartment info
 				# substrate is missing then check for product
 				if (reac.getNumReactants()):
-					react = reac.getReactant(0)
+					react = reac.getReactant(reac.getNumReactants()-1)
 					sp = react.getSpecies()
+					sp = str(idBeginWith(sp))
 					speCompt = specInfoMap[sp]["comptId"].path
 					reaction_ = moose.Reac(speCompt+'/'+rName)
 					reactionCreated = True
@@ -327,24 +475,115 @@ def createReaction(model,specInfoMap,modelAnnotaInfo):
 				if (reac.getNumProducts()):
 					react = reac.getProducts(0)
 					sp = react.getSpecies()
+					sp = str(idBeginWith(sp))
 					speCompt = specInfoMap[sp]["comptId"].path
 					reaction_ = moose.Reac(speCompt+'/'+rName)
 					reactionCreated = True
-					reactSBMLIdMooseId[rName] = {"MooseId":reaction_}
-
+					reactSBMLIdMooseId[rId] = {"MooseId" : reaction_, "className": "reaction"}
 			if reactionCreated:
 				if (reac.isSetNotes):
 					pullnotes(reac,reaction_)
+				reacAnnoInfo = getObjAnnotation(reac,modelAnnotaInfo)
+				if reacAnnoInfo:
+					if not moose.exists(reaction_.path+'/info'):
+						reacInfo = moose.Annotator(reaction_.path+'/info')
+					else:
+						reacInfo = moose.element(reaction_.path+'/info')
+					for k,v in list(reacAnnoInfo.items()):
+						if k == 'xCord':
+							reacInfo.x = float(v)
+						elif k == 'yCord':
+							reacInfo.y = float(v)
+						elif k == 'bgColor':
+							reacInfo.color = v
+						else:
+							reacInfo.textColor = v
+
 				addSubPrd(reac,rName,"sub",reactSBMLIdMooseId,specInfoMap)
 				addSubPrd(reac,rName,"prd",reactSBMLIdMooseId,specInfoMap)
-	# print "react ",reactSBMLIdMooseId
-	return errorFlag
+				if reac.isSetKineticLaw():
+					klaw = reac.getKineticLaw()
+					mmsg = ""
+					errorFlag, mmsg,kfvalue,kbvalue = getKLaw(model,klaw,rev,globparameterIdValue,specInfoMap)
+					if not errorFlag:
+						msg = "Error while importing reaction \""+rName+"\"\n Error in kinetics law "
+						if mmsg != "":
+							msg = msg+mmsg
+						return(errorFlag,msg)
+					else:
+						#print " reactSBMLIdMooseId ",reactSBMLIdMooseId[rName]["nSub"], " prd ",reactSBMLIdMooseId[rName]["nPrd"]
+						if reaction_.className == "Reac":
+							subn = reactSBMLIdMooseId[rName]["nSub"]
+							prdn = reactSBMLIdMooseId[rName]["nPrd"]
+							reaction_.Kf = kfvalue #* pow(1e-3,subn-1)
+							reaction_.Kb = kbvalue #* pow(1e-3,prdn-1)
+						elif reaction_.className == "MMenz":
+							reaction_.kcat  = kfvalue
+							reaction_.Km = kbvalue
+	return (errorFlag,msg)
+
+def getKLaw(model, klaw, rev, globparameterIdValue, specMapList):
+    parmValueMap = {}
+    amt_Conc = "amount"
+    value = 0.0
+    np = klaw.getNumParameters()
+    for pi in range(0, np):
+        p = klaw.getParameter(pi)
+        if ( p.isSetId() ):
+            ids = p.getId()
+        if ( p.isSetValue() ):
+            value = p.getValue()
+        parmValueMap[ids] = value
+    ruleMemlist = []
+    flag = getMembers(klaw.getMath(),ruleMemlist)
+    index = 0 
+    kfparm = ""
+    kbparm = ""
+    kfvalue = 0
+    kbvalue = 0 
+    kfp = ""
+    kbp = ""
+    mssgstr =  ""
+    for i in ruleMemlist:
+    	if i in parmValueMap or i in globparameterIdValue:
+    		if index == 0:
+    			kfparm = i
+    			if i in parmValueMap:
+    				kfvalue = parmValueMap[i]
+    				kfp = klaw.getParameter(kfparm)
+    			else:
+    				kfvalue = globparameterIdValue[i]
+    				kfp = model.getParameter(kfparm)
+    		elif index == 1:
+    			kbparm = i
+    			if i in parmValueMap:
+    				kbvalue = parmValueMap[i]
+    				kbp = klaw.getParameter(kbparm)
+    			else:
+    				kbvalue = globparameterIdValue[i]
+    				kbp = model.getParameter(kbparm)
+    		index += 1
+
+    	elif not (i in specMapList or i in comptSbmlidMooseIdMap):
+    		mssgstr = "\""+i+ "\" is not defined "
+    		return (False, mssgstr, kfvalue, kbvalue)
+    if kfp != "":
+    	#print " unit set for rate law kfp ",kfparm, " ",kfp.isSetUnits()
+    	if kfp.isSetUnits():
+    		kfud = kfp.getDerivedUnitDefinition();
+    		#print " kfud ",kfud
+    if kbp != "":
+    	pass
+    	#print " unit set for rate law kbp ",kbparm, " ",kbp.isSetUnits()
+
+    return (True,mssgstr,kfvalue,kbvalue)
 
 def getMembers(node,ruleMemlist):
 	if node.getType() == libsbml.AST_PLUS:
 		if node.getNumChildren() == 0:
 			print ("0")
-			return
+			return False
 		getMembers(node.getChild(0),ruleMemlist)
 		for i in range(1,node.getNumChildren()):
 			# addition
@@ -355,61 +594,117 @@ def getMembers(node,ruleMemlist):
 	elif node.getType() == libsbml.AST_NAME:
 		#This will be the ci term"
 		ruleMemlist.append(node.getName())
-
+	elif node.getType() == libsbml.AST_MINUS:
+		if node.getNumChildren() == 0:
+			print("0")
+			return False
+		else:
+			lchild = node.getLeftChild();
+			getMembers(lchild,ruleMemlist)
+			rchild = node.getRightChild();
+			getMembers(rchild,ruleMemlist)
+	elif node.getType() == libsbml.AST_DIVIDE:
+		
+		if node.getNumChildren() == 0:
+			print("0")
+			return False
+		else:
+			lchild = node.getLeftChild();
+			getMembers(lchild,ruleMemlist)
+			rchild = node.getRightChild();
+			getMembers(rchild,ruleMemlist)
+	
 	elif node.getType() == libsbml.AST_TIMES:
 		if node.getNumChildren() == 0:
 			print ("0")
-			return
+			return False
 		getMembers(node.getChild(0),ruleMemlist)
 		for i in range(1,node.getNumChildren()):
 			# Multiplication
 			getMembers(node.getChild(i),ruleMemlist)
+	
+	elif node.getType() == libsbml.AST_FUNCTION_POWER:
+		pass
 	else:
-		print " this case need to be handled"
+		print(" this case needs to be handled: ", node.getType())
+	# if len(ruleMemlist) > 2:
+	# 	print("Sorry! for now MOOSE cannot handle more than 2 parameters")
+	return True
+ #        return True
 
 def createRules(model,specInfoMap,globparameterIdValue):
 	for r in range(0,model.getNumRules()):
-			rule = model.getRule(r)
-			if (rule.isAssignment()):
-				rule_variable = rule.getVariable();
-				poolList = specInfoMap[rule_variable]["Mpath"].path
-				funcId = moose.Function(poolList+'/func')
+		rule = model.getRule(r)
+		comptvolume = []
+		if (rule.isAssignment()):
+			rule_variable = rule.getVariable();
+			rule_variable = str(idBeginWith(rule_variable))
+			poolList = specInfoMap[rule_variable]["Mpath"].path
+			poolsCompt = findCompartment(moose.element(poolList))
+			if not isinstance(moose.element(poolsCompt),moose.ChemCompt):
+				return -2
+			else:
+				if poolsCompt.name not in comptvolume:
+					comptvolume.append(poolsCompt.name)
+
+			funcId = moose.Function(poolList+'/func')
+			
+			objclassname = moose.element(poolList).className
+			if  objclassname == "BufPool" or objclassname == "ZombieBufPool":
 				moose.connect( funcId, 'valueOut', poolList ,'setN' )
-				ruleMath = rule.getMath()
-				ruleMemlist = []
-				speFunXterm = {}
-				getMembers(ruleMath,ruleMemlist)
-				for i in ruleMemlist:
-					if (specInfoMap.has_key(i)):
-						specMapList = specInfoMap[i]["Mpath"]
-						numVars = funcId.numVars
-						x = funcId.path+'/x['+str(numVars)+']'
-						speFunXterm[i] = 'x'+str(numVars)
-						moose.connect(specMapList , 'nOut', x, 'input' )
-						funcId.numVars = numVars +1
-					elif not(globparameterIdValue.has_key(i)):
-						print "check the variable type ",i
-
-				exp = rule.getFormula()
-				for mem in ruleMemlist:
-					if ( specInfoMap.has_key(mem)):
-						exp1 = exp.replace(mem,str(speFunXterm[mem]))
-						exp = exp1
-					elif( globparameterIdValue.has_key(mem)):
-						exp1 = exp.replace(mem,str(globparameterIdValue[mem]))
-						exp = exp1
+			elif  objclassname == "Pool" or objclassname == "ZombiePool":
+				#moose.connect( funcId, 'valueOut', poolList, 'increment' )
+				moose.connect(funcId, 'valueOut', poolList ,'setN' )
+			elif  objclassname == "Reac" or objclassname == "ZombieReac":
+				moose.connect( funcId, 'valueOut', poolList ,'setNumkf' )	
+			
+			ruleMath = rule.getMath()
+			ruleMemlist = []
+			speFunXterm = {}
+			getMembers(ruleMath,ruleMemlist)
+			
+			for i in ruleMemlist:
+
+				if (i in specInfoMap):
+					i = str(idBeginWith(i))
+					specMapList = specInfoMap[i]["Mpath"]
+					poolsCompt = findCompartment(moose.element(specMapList))
+					if not isinstance(moose.element(poolsCompt),moose.ChemCompt):
+						return -2
 					else:
-						print "Math expression need to be checked"
-				funcId.expr = exp.strip(" \t\n\r")
-				return True
-
-			elif( rule.isRate() ):
-				print "Warning : For now this \"",rule.getVariable(), "\" rate Rule is not handled in moose "
-				return False
-
-			elif ( rule.isAlgebraic() ):
-				print "Warning: For now this " ,rule.getVariable()," Algebraic Rule is not handled in moose"
-				return False
+						if poolsCompt.name not in comptvolume:
+							comptvolume.append(poolsCompt.name)
+					numVars = funcId.numVars
+					x = funcId.path+'/x['+str(numVars)+']'
+					speFunXterm[i] = 'x'+str(numVars)
+					moose.connect(specMapList , 'nOut', x, 'input' )
+					funcId.numVars = numVars +1
+
+				elif not(i in globparameterIdValue):
+					print("check the variable type ",i)
+			exp = rule.getFormula()
+			for mem in ruleMemlist:
+				if ( mem in specInfoMap):
+					exp1 = exp.replace(mem,str(speFunXterm[mem]))
+					exp = exp1
+				elif( mem in globparameterIdValue):
+					exp1 = exp.replace(mem,str(globparameterIdValue[mem]))
+					exp = exp1
+				else:
+					print("Math expression needs to be checked")
+			exp = exp.replace(" ","")
+			funcId.expr = exp.strip(" \t\n\r")
+			#return True
+
+		elif( rule.isRate() ):
+			print("Warning: for now the rate rule for \"", rule.getVariable(), "\" is not handled in MOOSE")
+			#return False
+
+		elif ( rule.isAlgebraic() ):
+			print("Warning: for now the algebraic rule for \"", rule.getVariable(), "\" is not handled in MOOSE")
+			#return False
+		if len(comptvolume) >1:
+			warning = "\nFunction " + moose.element(poolList).name + " has inputs from different compartments, which is deprecated in MOOSE and may crash the simulation"
 	return True
 
 def pullnotes(sbmlId,mooseId):
@@ -424,11 +719,12 @@ def pullnotes(sbmlId,mooseId):
 			objInfo = moose.element(mooseId.path+'/info')
 		objInfo.notes = notes
 
-def createSpecies(basePath,model,comptSbmlidMooseIdMap,specInfoMap):
+def createSpecies(basePath,model,comptSbmlidMooseIdMap,specInfoMap,modelAnnotaInfo):
 	# ToDo:
 	# - Need to add group name if exist in pool
 	# - Notes
 	# print "species "
+
 	if not 	(model.getNumSpecies()):
 		return False
 	else:
@@ -436,7 +732,6 @@ def createSpecies(basePath,model,comptSbmlidMooseIdMap,specInfoMap):
 			spe = model.getSpecies(sindex)
 			sName = None
 			sId = spe.getId()
-
 			if spe.isSetName():
 				sName = spe.getName()
 				sName = sName.replace(" ","_space_")
@@ -453,7 +748,7 @@ def createSpecies(basePath,model,comptSbmlidMooseIdMap,specInfoMap):
 			hasonlySubUnit = spe.getHasOnlySubstanceUnits();
 			# "false": is {unit of amount}/{unit of size} (i.e., concentration or density). 
 			# "true": then the value is interpreted as having a unit of amount only.
-
+			
 			if (boundaryCondition):
 				poolId = moose.BufPool(comptEl+'/'+sName)
 			else:
@@ -461,7 +756,23 @@ def createSpecies(basePath,model,comptSbmlidMooseIdMap,specInfoMap):
 			
 			if (spe.isSetNotes):
 				pullnotes(spe,poolId)
-					
+			specAnnoInfo = {}
+			specAnnoInfo = getObjAnnotation(spe,modelAnnotaInfo)
+			if specAnnoInfo:
+				if not moose.exists(poolId.path+'/info'):
+					poolInfo = moose.Annotator(poolId.path+'/info')
+				else:
+					poolInfo = moose.element(poolId.path+'/info')
+				for k,v in list(specAnnoInfo.items()):
+					if k == 'xCord':
+						poolInfo.x = float(v)
+					elif k == 'yCord':
+						poolInfo.y = float(v)
+					elif k == 'bgColor':
+						poolInfo.color = v
+					else:
+						poolInfo.textColor = v
+
 			specInfoMap[sId] = {"Mpath" : poolId, "const" : constant, "bcondition" : boundaryCondition, "hassubunit" : hasonlySubUnit, "comptId" : comptSbmlidMooseIdMap[comptId]["MooseId"]}
 			initvalue = 0.0
 			unitfactor,unitset,unittype = transformUnit(spe,hasonlySubUnit)
@@ -478,7 +789,7 @@ def createSpecies(basePath,model,comptSbmlidMooseIdMap,specInfoMap):
 					initvalue = initvalue * unitfactor
 				elif spe.isSetInitialConcentration():
 					initvalue = spe.getInitialConcentration()
-					print " Since hasonlySubUnit is true and concentration is set units are not checked"
+					print(" Since hasonlySubUnit is true and concentration is set, units are not checked")
 				poolId.nInit = initvalue
 
 			elif hasonlySubUnit == False:
@@ -508,8 +819,10 @@ def createSpecies(basePath,model,comptSbmlidMooseIdMap,specInfoMap):
 							found = True
 							break
 				if not (found):
-					print "Invalid SBML: Either initialConcentration or initialAmount must be set or it should be found in assignmentRule but non happening for ",sName
-					return False	
+					print("Invalid SBML: Either initialConcentration or initialAmount must be set, or the value must come from an assignmentRule, but neither was found for ",sName)
+					return False
+
+
 	return True
 
 def transformUnit(unitForObject,hasonlySubUnit=False):
@@ -533,7 +846,7 @@ def transformUnit(unitForObject,hasonlySubUnit=False):
 					lvalue *= pow( multiplier * pow(10.0,scale), exponent ) + offset;
 					unitset = True
 					unittype = "Litre"
-
+					return (lvalue,unitset,unittype)
 				elif( unitType.isMole()):
 					exponent = unitType.getExponent()
 					multiplier = unitType.getMultiplier()
@@ -544,12 +857,15 @@ def transformUnit(unitForObject,hasonlySubUnit=False):
 						lvalue *= pow(multiplier * pow(10.0,scale),exponent) + offset
 						#If SBML units are in mole then convert to number by multiplying with avogadro's number
 						lvalue = lvalue * pow(6.0221409e23,1)
-
 					elif hasonlySubUnit == False: 
-						#Pool units are in mM, so to scale adding +3 to convert to m
-						lvalue *= pow( multiplier * pow(10.0,scale+3), exponent ) + offset;
+						# Pool units in MOOSE are mM
+						if scale > 0:
+							lvalue *= pow(multiplier * pow(10.0, scale-3), exponent) + offset
+						else:
+							lvalue *= pow(multiplier * pow(10.0, scale+3), exponent) + offset
 					unitset = True
 					unittype = "Mole"
+					return (lvalue,unitset,unittype)
 		
 				elif( unitType.isItem()):
 					exponent = unitType.getExponent()
@@ -567,9 +883,9 @@ def transformUnit(unitForObject,hasonlySubUnit=False):
 						lvalue = lvalue/pow(6.0221409e23,1)
 					unitset = True
 					unittype = "Item"
+					return (lvalue,unitset,unittype)
 		else:
 			lvalue = 1.0
-		print " end of the func lvaue ",lvalue
 	return (lvalue,unitset,unittype)
 def createCompartment(basePath,model,comptSbmlidMooseIdMap):
 	#ToDoList : Check what should be done for the spaitialdimension is 2 or 1, area or length
@@ -597,14 +913,14 @@ def createCompartment(basePath,model,comptSbmlidMooseIdMap):
 			if ( compt.isSetSize() ):
 				msize = compt.getSize()
 				if msize == 1:
-					print "Compartment size is 1"
+					print("Compartment size is 1")
 
 			dimension = compt.getSpatialDimensions();
 			if dimension == 3:
 				unitfactor,unitset, unittype = transformUnit(compt)
 				
 			else:
-				print " Currently we don't deal with spatial Dimension less than 3 and unit's area or length"
+				print(" Currently spatial dimensions less than 3 (area or length units) are not handled")
 				return False
 
 			if not( name ):
@@ -614,6 +930,42 @@ def createCompartment(basePath,model,comptSbmlidMooseIdMap):
 			mooseCmptId.volume = (msize*unitfactor)
 			comptSbmlidMooseIdMap[sbmlCmptId]={"MooseId": mooseCmptId, "spatialDim":dimension, "size" : msize}
 	return True
+def setupMMEnzymeReaction(reac,rName,specInfoMap,reactSBMLIdMooseId,modelAnnotaInfo,model,globparameterIdValue):
+	msg = ""
+	errorFlag = ""
+	numRcts = reac.getNumReactants()
+	numPdts = reac.getNumProducts()
+	nummodifiers = reac.getNumModifiers()
+	if (nummodifiers):
+		parent = reac.getModifier(0)
+		parentSp = parent.getSpecies()
+		parentSp = str(idBeginWith(parentSp))
+		enzParent = specInfoMap[parentSp]["Mpath"]
+		MMEnz = moose.MMenz(enzParent.path+'/'+rName)
+		moose.connect(enzParent,"nOut",MMEnz,"enzDest");
+		reactionCreated = True
+		reactSBMLIdMooseId[rName] = { "MooseId":MMEnz, "className":"MMEnz"}
+		if reactionCreated:
+			if reac.isSetNotes():
+				pullnotes(reac,MMEnz)
+			reacAnnoInfo = getObjAnnotation(reac,modelAnnotaInfo)
+			if reacAnnoInfo:
+				if not moose.exists(MMEnz.path+'/info'):
+					reacInfo = moose.Annotator(MMEnz.path+'/info')
+				else:
+					reacInfo = moose.element(MMEnz.path+'/info')
+				for k,v in list(reacAnnoInfo.items()):
+					if k == 'xCord':
+						reacInfo.x = float(v)
+					elif k == 'yCord':
+						reacInfo.y = float(v)
+					elif k == 'bgColor':
+						reacInfo.color = v
+					else:
+						reacInfo.textColor = v
+			return(reactionCreated,MMEnz)
+
 def mapParameter(model,globparameterIdValue):
 	for pm in range(0,model.getNumParameters()):
 		prm = model.getParameter( pm );
@@ -624,6 +976,29 @@ def mapParameter(model,globparameterIdValue):
 			value = prm.getValue()
 		globparameterIdValue[parid] = value
 
+def idBeginWith(name):
+	changedName = name
+	if name[0].isdigit():
+		changedName = "_" + name
+	return changedName
+
+def convertSpecialChar(str1):
+	d = {"&":"_and","<":"_lessthan_",">":"_greaterthan_","BEL":"&#176","-":"_minus_","'":"_prime_",
+		 "+": "_plus_","*":"_star_","/":"_slash_","(":"_bo_",")":"_bc_",
+		 "[":"_sbo_","]":"_sbc_",".":"_dot_"," ":"_"
+		}
+	for i,j in list(d.items()):
+		str1 = str1.replace(i,j)
+	return str1
+
+def mooseIsInstance(element, classNames):
+	return moose.element(element).__class__.__name__ in classNames
+
+def findCompartment(element):
+	while not mooseIsInstance(element,["CubeMesh","CyclMesh"]):
+		element = element.parent
+	return element
+
 if __name__ == "__main__":
 	
 	filepath = sys.argv[1]
@@ -638,6 +1013,6 @@ if __name__ == "__main__":
 	
 	read = mooseReadSBML(filepath,loadpath)
 	if read:
-		print " Read to path",loadpath
+		print(" Read to path",loadpath)
 	else:
-		print " could not read  SBML to MOOSE"
+		print(" could not read  SBML to MOOSE")
diff --git a/python/moose/SBML/writeSBML.py b/python/moose/SBML/writeSBML.py
index b810c7bb..988b8c47 100644
--- a/python/moose/SBML/writeSBML.py
+++ b/python/moose/SBML/writeSBML.py
@@ -12,115 +12,160 @@
 **           copyright (C) 2003-2016 Upinder S. Bhalla. and NCBS
 Created : Friday May 27 12:19:00 2016(+0530)
 Version 
-Last-Updated:
+Last-Updated: Thursday Oct 27 11:20:00 2016(+0530)
 		  By:
 **********************************************************************/
 /****************************
 
 '''
 from moose import *
-from libsbml import *
 import re
 from collections import Counter
-#from moose import wildcardFind, element, loadModel, ChemCompt, exists, Annotator, Pool, ZombiePool,PoolBase,CplxEnzBase,Function,ZombieFunction
+import networkx as nx
+import matplotlib.pyplot as plt
+import sys
 
 #ToDo:
 #	Table should be written
 #	Group's should be added
-#   x and y cordinates shd be added if exist
+#	boundary condition for a buffer pool with an assignment statement: constant should be false
 
-def mooseWriteSBML(modelpath,filename):
-	sbmlDoc = SBMLDocument(3, 1)
-	filepath,filenameExt = os.path.split(filename)
-	if filenameExt.find('.') != -1:
-		filename = filenameExt[:filenameExt.find('.')]
-	else:
-		filename = filenameExt
-	
-	#validatemodel
-	sbmlOk = False
-	global spe_constTrue
-	spe_constTrue = []
-	global nameList_
-	nameList_ = []
-
-	xmlns = XMLNamespaces()
-	xmlns.add("http://www.sbml.org/sbml/level3/version1")
-	xmlns.add("http://www.moose.ncbs.res.in","moose")
-	xmlns.add("http://www.w3.org/1999/xhtml","xhtml")
-	sbmlDoc.setNamespaces(xmlns)
-	cremodel_ = sbmlDoc.createModel()
-	cremodel_.setId(filename)
-	cremodel_.setTimeUnits("second")
-	cremodel_.setExtentUnits("substance")
-	cremodel_.setSubstanceUnits("substance")
-	
-	writeUnits(cremodel_)
-	modelAnno = writeSimulationAnnotation(modelpath)
-	if modelAnno:
-		cremodel_.setAnnotation(modelAnno)
-	compartexist = writeCompt(modelpath,cremodel_)
-	species = writeSpecies(modelpath,cremodel_,sbmlDoc)
-	if species:
-		writeFunc(modelpath,cremodel_)
-	writeReac(modelpath,cremodel_)
-	writeEnz(modelpath,cremodel_)
-
-	consistencyMessages = ""
-	SBMLok = validateModel( sbmlDoc )
-	if ( SBMLok ):
-		#filepath = '/home/harsha/Trash/python'
-		#SBMLString = writeSBMLToString(sbmlDoc)
-		writeTofile = filepath+"/"+filename+'.xml'
-		writeSBMLToFile( sbmlDoc, writeTofile)
-		return True,consistencyMessages
-
-	if ( not SBMLok ):
-		cerr << "Errors encountered " << endl;
-		return -1,consistencyMessages
-
-def writeEnz(modelpath,cremodel_):
+try:
+    from libsbml import *
+except ImportError:
+    def mooseWriteSBML(modelpath, filename, sceneitems={}):
+        return (-2, "\n WriteSBML : python-libsbml module not installed", None)
+else:
+	def mooseWriteSBML(modelpath,filename,sceneitems={}):
+			sbmlDoc = SBMLDocument(3, 1)
+			filepath,filenameExt = os.path.split(filename)
+			if filenameExt.find('.') != -1:
+				filename = filenameExt[:filenameExt.find('.')]
+			else:
+				filename = filenameExt
+			
+			#validatemodel
+			sbmlOk = False
+			global spe_constTrue,cmin,cmax
+			spe_constTrue = []
+			global nameList_
+			nameList_ = []
+
+			autoCoordinateslayout = False
+			xmlns = XMLNamespaces()
+			xmlns.add("http://www.sbml.org/sbml/level3/version1")
+			xmlns.add("http://www.moose.ncbs.res.in","moose")
+			xmlns.add("http://www.w3.org/1999/xhtml","xhtml")
+			sbmlDoc.setNamespaces(xmlns)
+			cremodel_ = sbmlDoc.createModel()
+			cremodel_.setId(filename)
+			cremodel_.setTimeUnits("second")
+			cremodel_.setExtentUnits("substance")
+			cremodel_.setSubstanceUnits("substance")
+			neutralNotes = ""
+			specieslist = wildcardFind(modelpath+'/##[ISA=PoolBase]')
+			neutralPath = getGroupinfo(specieslist[0])
+			if moose.exists(neutralPath.path+'/info'):
+				neutralInfo = moose.element(neutralPath.path+'/info')
+				neutralNotes = neutralInfo.notes
+			if neutralNotes != "":
+				cleanNotes= convertNotesSpecialChar(neutralNotes)
+				notesString = "<body xmlns=\"http://www.w3.org/1999/xhtml\">\n \t \t"+ cleanNotes + "\n\t </body>"
+				cremodel_.setNotes(notesString)
+			srcdesConnection = {}
+			cmin,cmax = 0,1
+
+			if not bool(sceneitems):
+				autoCoordinateslayout = True
+				srcdesConnection = setupItem(modelpath)
+				meshEntry = setupMeshObj(modelpath)
+				cmin,cmax,sceneitems = autoCoordinates(meshEntry,srcdesConnection)
+			
+			writeUnits(cremodel_)
+			modelAnno = writeSimulationAnnotation(modelpath)
+			if modelAnno:
+				cremodel_.setAnnotation(modelAnno)
+			compartexist = writeCompt(modelpath,cremodel_)
+			species = writeSpecies(modelpath,cremodel_,sbmlDoc,sceneitems,autoCoordinateslayout)
+			if species:
+				writeFunc(modelpath,cremodel_)
+			writeReac(modelpath,cremodel_,sceneitems,autoCoordinateslayout)
+			writeEnz(modelpath,cremodel_,sceneitems,autoCoordinateslayout)
+
+			consistencyMessages = ""
+			SBMLok = validateModel( sbmlDoc )
+			if ( SBMLok ):
+				writeTofile = filepath+"/"+filename+'.xml'
+				writeSBMLToFile( sbmlDoc, writeTofile)
+				return True,consistencyMessages,writeTofile
+
+			if ( not SBMLok ):
+				print("Errors encountered while writing SBML")
+				return -1,consistencyMessages,None
+
+def writeEnz(modelpath,cremodel_,sceneitems,autoCoordinateslayout):
 	for enz in wildcardFind(modelpath+'/##[ISA=EnzBase]'):
 		enzannoexist = False
-		enzGpname = " "
+		enzGpnCorCol = " "
 		cleanEnzname = convertSpecialChar(enz.name) 
 		enzSubt = ()        
-
+		compt = ""
+		notesE = ""
 		if moose.exists(enz.path+'/info'):
 			Anno = moose.Annotator(enz.path+'/info')
 			notesE = Anno.notes
 			element = moose.element(enz)
 			ele = getGroupinfo(element)
-			if ele.className == "Neutral":
-				enzGpname = "<moose:Group> "+ ele.name + " </moose:Group>\n"
+			if ele.className == "Neutral" or sceneitems.get(element) or Anno.color or Anno.textColor:
 				enzannoexist = True
 
+			if enzannoexist :
+				if ele.className == "Neutral":
+					enzGpnCorCol = "<moose:Group> "+ ele.name + " </moose:Group>\n"
+				if sceneitems.get(element):
+					v = sceneitems[element]
+					if autoCoordinateslayout == False:
+						enzGpnCorCol = enzGpnCorCol+"<moose:xCord>"+str(v['x'])+"</moose:xCord>\n"+"<moose:yCord>"+str(v['y'])+"</moose:yCord>\n"
+					elif autoCoordinateslayout == True:
+						x = calPrime(v['x'])
+						y = calPrime(v['y'])
+						enzGpnCorCol = enzGpnCorCol+"<moose:xCord>"+str(x)+"</moose:xCord>\n"+"<moose:yCord>"+str(y)+"</moose:yCord>\n"
+				if Anno.color:
+					enzGpnCorCol = enzGpnCorCol+"<moose:bgColor>"+Anno.color+"</moose:bgColor>\n"
+				if Anno.textColor:
+					enzGpnCorCol = enzGpnCorCol+"<moose:textColor>"+Anno.textColor+"</moose:textColor>\n"
+				
 		if (enz.className == "Enz" or enz.className == "ZombieEnz"):
 			enzyme = cremodel_.createReaction()
 			if notesE != "":
 				cleanNotesE= convertNotesSpecialChar(notesE)
 				notesStringE = "<body xmlns=\"http://www.w3.org/1999/xhtml\">\n \t \t"+ cleanNotesE + "\n\t </body>"
 				enzyme.setNotes(notesStringE)
+			comptVec = findCompartment(moose.element(enz))
+			if not isinstance(moose.element(comptVec),moose.ChemCompt):
+				return -2
+			else:
+				compt = comptVec.name+"_"+str(comptVec.getId().value)+"_"+str(comptVec.getDataIndex())+"_"
+			
 			enzyme.setId(str(idBeginWith(cleanEnzname+"_"+str(enz.getId().value)+"_"+str(enz.getDataIndex())+"_"+"Complex_formation_")))
 			enzyme.setName(cleanEnzname)
 			enzyme.setFast ( False )
 			enzyme.setReversible( True)
-			k1 = enz.k1
+			k1 = enz.concK1
 			k2 = enz.k2
 			k3 = enz.k3
-
+			enzAnno = " "
 			enzAnno ="<moose:EnzymaticReaction>\n"
-			if enzannoexist:
-				enzAnno=enzAnno + enzGpname
+			
 			enzOut = enz.neighbors["enzOut"]
 			
 			if not enzOut:
-				print " Enzyme parent missing for ",enz.name
+				print(" Enzyme parent missing for ",enz.name)
 			else:
 				listofname(enzOut,True)
 				enzSubt = enzOut
 				for i in range(0,len(nameList_)):
-					enzAnno=enzAnno+"<moose:enzyme>"+nameList_[i]+"</moose:enzyme>\n"
+					enzAnno=enzAnno+"<moose:enzyme>"+(str(idBeginWith(convertSpecialChar(nameList_[i]))))+"</moose:enzyme>\n"
 			#noofSub,sRateLaw = getSubprd(cremodel_,True,"sub",enzSub)
 			#for i in range(0,len(nameList_)):
 			#    enzAnno=enzAnno+"<moose:enzyme>"+nameList_[i]+"</moose:enzyme>\n"
@@ -129,7 +174,7 @@ def writeEnz(modelpath,cremodel_):
 			
 			enzSub = enz.neighbors["sub"]
 			if not enzSub:
-				print "Enzyme \"",enz.name,"\" substrate missing"
+				print("Enzyme \"",enz.name,"\" substrate missing")
 			else:
 				listofname(enzSub,True)
 				enzSubt += enzSub
@@ -140,26 +185,27 @@ def writeEnz(modelpath,cremodel_):
 				noofSub,sRateLaw = getSubprd(cremodel_,True,"sub",enzSubt)
 				#rec_order = rec_order + noofSub
 				rec_order = noofSub
-				rate_law = rate_law +"*"+sRateLaw
-				
-		   
+				rate_law = compt+" * "+rate_law +"*"+sRateLaw
 
 			enzPrd = enz.neighbors["cplxDest"]
 			if not enzPrd:
-				print "Enzyme \"",enz.name,"\"product missing"
+				print("Enzyme \"",enz.name,"\"product missing")
 			else:
 				noofPrd,sRateLaw = getSubprd(cremodel_,True,"prd",enzPrd)
 				for i in range(0,len(nameList_)):
 					enzAnno= enzAnno+"<moose:product>"+nameList_[i]+"</moose:product>\n"
-				rate_law = rate_law+ " - "+"k2"+'*'+sRateLaw 
+				rate_law = rate_law+ " - "+compt+"* k2"+'*'+sRateLaw 
 			
 			prd_order = noofPrd
 			enzAnno = enzAnno + "<moose:groupName>" + cleanEnzname + "_" + str(enz.getId().value) + "_" + str(enz.getDataIndex()) + "_" + "</moose:groupName>\n"
 			enzAnno = enzAnno+"<moose:stage>1</moose:stage>\n"
+			if enzannoexist:
+				enzAnno=enzAnno + enzGpnCorCol
 			enzAnno = enzAnno+ "</moose:EnzymaticReaction>"
 			enzyme.setAnnotation(enzAnno)
 			kl = enzyme.createKineticLaw()
 			kl.setFormula( rate_law )
+			kl.setNotes("<body xmlns=\"http://www.w3.org/1999/xhtml\">\n\t\t" + rate_law + "\n \t </body>")
 			punit = parmUnit( prd_order-1, cremodel_ )
 			printParameters( kl,"k2",k2,punit ) 
 			
@@ -174,7 +220,7 @@ def writeEnz(modelpath,cremodel_):
 			
 			enzSub = enz.neighbors["cplxDest"]
 			if not enzSub:
-				print " complex missing from ",enz.name
+				print(" complex missing from ",enz.name)
 			else:
 				noofSub,sRateLaw = getSubprd(cremodel_,True,"sub",enzSub)
 				for i in range(0,len(nameList_)):
@@ -182,7 +228,7 @@ def writeEnz(modelpath,cremodel_):
 
 			enzEnz = enz.neighbors["enzOut"]
 			if not enzEnz:
-				print "Enzyme parent missing for ",enz.name
+				print("Enzyme parent missing for ",enz.name)
 			else:
 				noofEnz,sRateLaw1 = getSubprd(cremodel_,True,"prd",enzEnz)
 				for i in range(0,len(nameList_)):
@@ -191,51 +237,63 @@ def writeEnz(modelpath,cremodel_):
 			if enzPrd:
 				noofprd,sRateLaw2 = getSubprd(cremodel_,True,"prd",enzPrd)
 			else:
-				print "Enzyme \"",enz.name, "\" product missing" 
+				print("Enzyme \"",enz.name, "\" product missing") 
 			for i in range(0,len(nameList_)):
 				enzAnno2 = enzAnno2+"<moose:product>"+nameList_[i]+"</moose:product>\n"
 			enzAnno2 += "<moose:groupName>"+ cleanEnzname + "_" + str(enz.getId().value) + "_" + str(enz.getDataIndex())+"_" +"</moose:groupName>\n";
 			enzAnno2 += "<moose:stage>2</moose:stage> \n";
+			if enzannoexist:
+				enzAnno2 = enzAnno2 + enzGpnCorCol
 			enzAnno2 += "</moose:EnzymaticReaction>";
 			enzyme.setAnnotation( enzAnno2 );
 
-			enzrate_law = "k3" + '*'+sRateLaw;
+			enzrate_law = compt +" * k3" + '*'+sRateLaw;
 			kl = enzyme.createKineticLaw();
 			kl.setFormula( enzrate_law );
+			kl.setNotes("<body xmlns=\"http://www.w3.org/1999/xhtml\">\n\t\t" + enzrate_law + "\n \t </body>")
 			unit = parmUnit(noofPrd-1 ,cremodel_)
 			printParameters( kl,"k3",k3,unit ); 
 			
 		elif(enz.className == "MMenz" or enz.className == "ZombieMMenz"):
-			enzyme = cremodel_.createReaction()
-			
-			if notesE != "":
-				cleanNotesE= convertNotesSpecialChar(notesE)
-				notesStringE = "<body xmlns=\"http://www.w3.org/1999/xhtml\">\n \t \t"+ cleanNotesE + "\n\t </body>"
-				enzyme.setNotes(notesStringE)
-			enzyme.setId(str(idBeginWith(cleanEnzname+"_"+str(enz.getId().value)+"_"+str(enz.getDataIndex())+"_")))
-			enzyme.setName(cleanEnzname)
-			enzyme.setFast ( False )
-			enzyme.setReversible( True)
-			if enzannoexist:
-				enzAnno = "<moose:EnzymaticReaction>\n" + enzGpname + "</moose:EnzymaticReaction>";
-				enzyme.setAnnotation(enzAnno)
-			Km = enz.numKm
-			kcat = enz.kcat
 			enzSub = enz.neighbors["sub"] 
-			noofSub,sRateLawS = getSubprd(cremodel_,False,"sub",enzSub)
-			#sRate_law << rate_law.str();
-			#Modifier
-			enzMod = enz.neighbors["enzDest"]
-			noofMod,sRateLawM = getSubprd(cremodel_,False,"enz",enzMod)
-			enzPrd = enz.neighbors["prd"]
-			noofPrd,sRateLawP = getSubprd(cremodel_,False,"prd",enzPrd)
-			kl = enzyme.createKineticLaw()
-			fRate_law ="kcat *" + sRateLawS + "*" + sRateLawM + "/" + "(" + "Km" + "+" +sRateLawS +")"
-			kl.setFormula(fRate_law)
-			kl.setNotes("<body xmlns=\"http://www.w3.org/1999/xhtml\">\n\t\t" + fRate_law + "\n \t </body>")
-			printParameters( kl,"Km",Km,"substance" )
-			kcatUnit = parmUnit( 0,cremodel_ )
-			printParameters( kl,"kcat",kcat,kcatUnit )
+			enzPrd = enz.neighbors["prd"] 
+			if (len(enzSub) != 0 and len(enzPrd) != 0 ):
+				enzCompt= findCompartment(enz)
+				if not isinstance(moose.element(enzCompt),moose.ChemCompt):
+					return -2
+				else:
+					compt = enzCompt.name+"_"+str(enzCompt.getId().value)+"_"+str(enzCompt.getDataIndex())+"_"
+				enzyme = cremodel_.createReaction()
+				enzAnno = " "
+				if notesE != "":
+					cleanNotesE= convertNotesSpecialChar(notesE)
+					notesStringE = "<body xmlns=\"http://www.w3.org/1999/xhtml\">\n \t \t"+ cleanNotesE + "\n\t </body>"
+					enzyme.setNotes(notesStringE)
+				enzyme.setId(str(idBeginWith(cleanEnzname+"_"+str(enz.getId().value)+"_"+str(enz.getDataIndex())+"_")))
+				enzyme.setName(cleanEnzname)
+				enzyme.setFast ( False )
+				enzyme.setReversible( True)
+				if enzannoexist:
+					enzAnno = "<moose:EnzymaticReaction>\n" + enzGpnCorCol + "</moose:EnzymaticReaction>";
+					enzyme.setAnnotation(enzAnno)
+				Km = enz.Km
+				kcat = enz.kcat
+				enzSub = enz.neighbors["sub"] 
+				noofSub,sRateLawS = getSubprd(cremodel_,False,"sub",enzSub)
+				#sRate_law << rate_law.str();
+				#Modifier
+				enzMod = enz.neighbors["enzDest"]
+				noofMod,sRateLawM = getSubprd(cremodel_,False,"enz",enzMod)
+				enzPrd = enz.neighbors["prd"]
+				noofPrd,sRateLawP = getSubprd(cremodel_,False,"prd",enzPrd)
+				kl = enzyme.createKineticLaw()
+				fRate_law = "kcat *" + sRateLawS + "*" + sRateLawM + "/(" + compt +" * ("+ "Km" + "+" +sRateLawS +"))"
+				kl.setFormula(fRate_law)
+				kl.setNotes("<body xmlns=\"http://www.w3.org/1999/xhtml\">\n\t\t" + fRate_law + "\n \t </body>")
+				printParameters( kl,"Km",Km,"substance" )
+				kcatUnit = parmUnit( 0,cremodel_ )
+				printParameters( kl,"kcat",kcat,kcatUnit )
 
 def printParameters( kl, k, kvalue, unit ):
 	para = kl.createParameter()
@@ -248,11 +306,11 @@ def parmUnit( rct_order,cremodel_ ):
 	if order == 0:
 		unit_stream = "per_second"
 	elif order == 1:
-		unit_stream = "per_item_per_second"
+		unit_stream = "litre_per_mmole_per_second"
 	elif order == 2:
-		unit_stream ="per_item_sq_per_second"
+		unit_stream ="litre_per_mmole_sq_per_second"
 	else:
-		unit_stream = "per_item_"+str(rct_order)+"_per_second";
+		unit_stream = "litre_per_mmole_"+str(rct_order)+"_per_second";
 
 	lud =cremodel_.getListOfUnitDefinitions();
 	flag = False;
@@ -267,10 +325,15 @@ def parmUnit( rct_order,cremodel_ ):
 		#Create individual unit objects that will be put inside the UnitDefinition .
 		if order != 0 :
 			unit = unitdef.createUnit()
-			unit.setKind( UNIT_KIND_ITEM )
-			unit.setExponent( -order )
+			unit.setKind( UNIT_KIND_LITRE )
+			unit.setExponent( 1 )
 			unit.setMultiplier(1)
 			unit.setScale( 0 )
+			unit = unitdef.createUnit()
+			unit.setKind( UNIT_KIND_MOLE )
+			unit.setExponent( -order )
+			unit.setMultiplier(1)
+			unit.setScale( -3 )
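+			# e.g. order 1 yields litre * mmole^-1 * second^-1 (scale -3 on mole = mmole),
+			# i.e. rate-constant units for mM-based concentrations rather than the
+			# earlier item-based (number-of-molecules) units.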
 
 		unit = unitdef.createUnit();
 		unit.setKind( UNIT_KIND_SECOND );
@@ -291,7 +354,6 @@ def getSubprd(cremodel_,mobjEnz,type,neighborslist):
 			rate_law = processRateLaw(reacSubCou,cremodel_,noofSub,"sub",mobjEnz)
 			return len(reacSub),rate_law
 		else:
-			print reac.className+ " has no substrate"
 			return 0,rate_law
 	elif type == "prd":
 		reacPrd = neighborslist
@@ -314,7 +376,7 @@ def getSubprd(cremodel_,mobjEnz,type,neighborslist):
 def processRateLaw(objectCount,cremodel,noofObj,type,mobjEnz):
 	rate_law = ""
 	nameList_[:] = []
-	for value,count in objectCount.iteritems():
+	for value,count in objectCount.items():
 		value = moose.element(value)
 		nameIndex = value.name+"_"+str(value.getId().value)+"_"+str(value.getDataIndex())+"_"
 		clean_name = (str(idBeginWith(convertSpecialChar(nameIndex))))
@@ -351,108 +413,141 @@ def processRateLaw(objectCount,cremodel,noofObj,type,mobjEnz):
 def listofname(reacSub,mobjEnz):
 	objectCount = Counter(reacSub)
 	nameList_[:] = []
-	for value,count in objectCount.iteritems():
+	for value,count in objectCount.items():
 		value = moose.element(value)
 		nameIndex = value.name+"_"+str(value.getId().value)+"_"+str(value.getDataIndex())+"_"
 		clean_name = convertSpecialChar(nameIndex)
 		if mobjEnz == True:
 			nameList_.append(clean_name)
-def writeReac(modelpath,cremodel_):
+
+def writeReac(modelpath,cremodel_,sceneitems,autoCoordinateslayout):
 	for reac in wildcardFind(modelpath+'/##[ISA=ReacBase]'):
-		reaction = cremodel_.createReaction()
-		reacannoexist = False
-		reacGpname = " "
-		cleanReacname = convertSpecialChar(reac.name) 
-		reaction.setId(str(idBeginWith(cleanReacname+"_"+str(reac.getId().value)+"_"+str(reac.getDataIndex())+"_")))
-		reaction.setName(cleanReacname)
-		Kf = reac.numKf
-		Kb = reac.numKb
-		if Kb == 0.0:
-			reaction.setReversible( False )
-		else:
-			reaction.setReversible( True )
-		
-		reaction.setFast( False )
-		if moose.exists(reac.path+'/info'):
-			Anno = moose.Annotator(reac.path+'/info')
-			notesR = Anno.notes
-			if notesR != "":
-				cleanNotesR= convertNotesSpecialChar(notesR)
-				notesStringR = "<body xmlns=\"http://www.w3.org/1999/xhtml\">\n \t \t"+ cleanNotesR + "\n\t </body>"
-				reaction.setNotes(notesStringR)
-			element = moose.element(reac)
-			ele = getGroupinfo(element)
-			if ele.className == "Neutral":
-				reacGpname = "<moose:Group>"+ ele.name + "</moose:Group>\n"
-				reacannoexist = True
-			if reacannoexist :
-				reacAnno = "<moose:ModelAnnotation>\n"
-				if reacGpname:
-					reacAnno = reacAnno + reacGpname
-				reacAnno = reacAnno+ "</moose:ModelAnnotation>"
-				#s1.appendAnnotation(XMLNode.convertStringToXMLNode(speciAnno))
-				reaction.setAnnotation(reacAnno)
-		
-		kl_s = sRL = pRL = ""
-		
 		reacSub = reac.neighbors["sub"]
 		reacPrd = reac.neighbors["prd"]
-		if not reacSub and not reacPrd:
-			print " Reaction ",reac.name, "missing substrate and product"
-		else:
-			kfl = reaction.createKineticLaw()
-			if reacSub:
-				noofSub,sRateLaw = getSubprd(cremodel_,False,"sub",reacSub)
-				if noofSub:
-					cleanReacname = cleanReacname+"_"+str(reac.getId().value)+"_"+str(reac.getDataIndex())+"_"
-					kfparm = idBeginWith(cleanReacname)+"_"+"Kf"
-					sRL = idBeginWith(cleanReacname) + "_Kf * " + sRateLaw
-					unit = parmUnit( noofSub-1 ,cremodel_)
-					printParameters( kfl,kfparm,Kf,unit ); 
-					kl_s = sRL
-				else:
-					print reac.name + " has no substrate"
-					return -2
+		if (len(reacSub) != 0  and len(reacPrd) != 0 ):
+
+			reaction = cremodel_.createReaction()
+			reacannoexist = False
+			reacGpname = " "
+			cleanReacname = convertSpecialChar(reac.name) 
+			reaction.setId(str(idBeginWith(cleanReacname+"_"+str(reac.getId().value)+"_"+str(reac.getDataIndex())+"_")))
+			reaction.setName(cleanReacname)
+			#Kf = reac.numKf
+			#Kb = reac.numKb
+			Kf = reac.Kf
+			Kb = reac.Kb
+			if Kb == 0.0:
+				reaction.setReversible( False )
 			else:
-				print " Substrate missing for reaction ",reac.name
-				
-			if reacPrd:
-				noofPrd,pRateLaw = getSubprd(cremodel_,False,"prd",reacPrd)
-				if  noofPrd:
-					if Kb:
-						kbparm = idBeginWith(cleanReacname)+"_"+"Kb"
-						pRL = idBeginWith(cleanReacname) + "_Kb * " + pRateLaw
-						unit = parmUnit( noofPrd-1 , cremodel_)
-						printParameters( kfl,kbparm,Kb,unit );
-						kl_s = kl_s+ "- "+pRL
-				else:
-					print reac.name + " has no product"
-					return -2
+				reaction.setReversible( True )
+			
+			reaction.setFast( False )
+			if moose.exists(reac.path+'/info'):
+				Anno = moose.Annotator(reac.path+'/info')
+				if Anno.notes != "":
+					cleanNotesR = convertNotesSpecialChar(Anno.notes)
+					notesStringR = "<body xmlns=\"http://www.w3.org/1999/xhtml\">\n \t \t"+ cleanNotesR + "\n\t </body>"
+					reaction.setNotes(notesStringR)
+				element = moose.element(reac)
+				ele = getGroupinfo(element)
+				if ele.className == "Neutral" or sceneitems[element] or Anno.color or Anno.textColor:
+					reacannoexist = True
+				if reacannoexist :
+					reacAnno = "<moose:ModelAnnotation>\n"
+					if ele.className == "Neutral":
+						reacAnno = reacAnno + "<moose:Group>"+ ele.name + "</moose:Group>\n"
+					if sceneitems[element]:
+						v = sceneitems[element]
+						if autoCoordinateslayout == False:
+							reacAnno = reacAnno+"<moose:xCord>"+str(v['x'])+"</moose:xCord>\n"+"<moose:yCord>"+str(v['y'])+"</moose:yCord>\n"
+						elif autoCoordinateslayout == True:
+							x = calPrime(v['x'])
+							y = calPrime(v['y'])
+							reacAnno = reacAnno+"<moose:xCord>"+str(x)+"</moose:xCord>\n"+"<moose:yCord>"+str(y)+"</moose:yCord>\n"
+					if Anno.color:
+						reacAnno = reacAnno+"<moose:bgColor>"+Anno.color+"</moose:bgColor>\n"
+					if Anno.textColor:
+						reacAnno = reacAnno+"<moose:textColor>"+Anno.textColor+"</moose:textColor>\n"
+					reacAnno = reacAnno+ "</moose:ModelAnnotation>"
+					#s1.appendAnnotation(XMLNode.convertStringToXMLNode(speciAnno))
+					reaction.setAnnotation(reacAnno)
+			kl_s, sRL, pRL, compt = "", "", "", ""
+		
+			if not reacSub and not reacPrd:
+				print(" Reaction ",reac.name, "missing substrate and product")
 			else:
-				print " Product missing for reaction ",reac.name
-		kfl.setFormula(kl_s)
+				kfl = reaction.createKineticLaw()
+				if reacSub:
+					noofSub,sRateLaw = getSubprd(cremodel_,False,"sub",reacSub)
+					if noofSub:
+						comptVec = findCompartment(moose.element(reacSub[0]))
+						if not isinstance(moose.element(comptVec),moose.ChemCompt):
+							return -2
+						else:
+							compt = comptVec.name+"_"+str(comptVec.getId().value)+"_"+str(comptVec.getDataIndex())+"_"
+						cleanReacname = cleanReacname+"_"+str(reac.getId().value)+"_"+str(reac.getDataIndex())+"_"
+						kfparm = idBeginWith(cleanReacname)+"_"+"Kf"
+						sRL = compt +" * " +idBeginWith(cleanReacname) + "_Kf * " + sRateLaw
+						unit = parmUnit( noofSub-1 ,cremodel_)
+						printParameters( kfl,kfparm,Kf,unit ); 
+						#kl_s = compt+"(" +sRL
+						kl_s = sRL
+					else:
+						print(reac.name + " has no substrate")
+						return -2
+				else:
+					print(" Substrate missing for reaction ",reac.name)
+					
+				if reacPrd:
+					noofPrd,pRateLaw = getSubprd(cremodel_,False,"prd",reacPrd)
+					if  noofPrd:
+						if Kb:
+							kbparm = idBeginWith(cleanReacname)+"_"+"Kb"
+							pRL = compt +" * " +idBeginWith(cleanReacname) + "_Kb * " + pRateLaw
+							unit = parmUnit( noofPrd-1 , cremodel_)
+							printParameters( kfl,kbparm,Kb,unit );
+							#kl_s = kl_s+ "- "+pRL+")"
+							kl_s = kl_s + "-"+pRL
+					else:
+						print(reac.name + " has no product")
+						return -2
+				else:
+					print(" Product missing for reaction ",reac.name)
+			kfl.setFormula(kl_s)
+			kfl.setNotes("<body xmlns=\"http://www.w3.org/1999/xhtml\">\n\t\t" + kl_s + "\n \t </body>")
+		else:
+			print(" Reaction ",reac.name, "missing substrate and product")
 
 def writeFunc(modelpath,cremodel_):
 	funcs = wildcardFind(modelpath+'/##[ISA=Function]')
 	#if func:
 	for func in funcs:
 		if func:
-			fName = idBeginWith( convertSpecialChar(func.parent.name+"_"+str(func.parent.getId().value)+"_"+str(func.parent.getDataIndex())+"_"))
-			item = func.path+'/x[0]'
-			sumtot = moose.element(item).neighbors["input"]
-			expr = moose.element(func).expr
-			for i in range(0,len(sumtot)):
-				v ="x"+str(i)
-				if v in expr:
-					z = str(convertSpecialChar(sumtot[i].name+"_"+str(moose.element(sumtot[i]).getId().value)+"_"+str(moose.element(sumtot[i]).getDataIndex()))+"_")
-					expr = expr.replace(v,z)
+			if func.parent.className == "CubeMesh" or func.parent.className == "CyclMesh":
+				funcEle = moose.element(moose.element(func).neighbors["valueOut"][0])
+				fName = idBeginWith(convertSpecialChar(funcEle.name+"_"+str(funcEle.getId().value)+"_"+str(funcEle.getDataIndex())+"_"))
+				expr = moose.element(func).expr
+
+			else:
+				fName = idBeginWith( convertSpecialChar(func.parent.name+"_"+str(func.parent.getId().value)+"_"+str(func.parent.getDataIndex())+"_"))
+				item = func.path+'/x[0]'
+				sumtot = moose.element(item).neighbors["input"]
+				expr = moose.element(func).expr
+				for i in range(0,len(sumtot)):
+					v ="x"+str(i)
+					if v in expr:
+						z = str(idBeginWith(str(convertSpecialChar(sumtot[i].name+"_"+str(moose.element(sumtot[i]).getId().value)+"_"+str(moose.element(sumtot[i]).getDataIndex()))+"_")))
+						expr = expr.replace(v,z)
+				
 			rule =  cremodel_.createAssignmentRule()
 			rule.setVariable( fName )
 			rule.setFormula( expr )
 			
 def convertNotesSpecialChar(str1):
 	d = {"&":"_and","<":"_lessthan_",">":"_greaterthan_","BEL":"&#176"}
-	for i,j in d.iteritems():
+	for i,j in d.items():
 		str1 = str1.replace(i,j)
 	#stripping \t \n \r and space from begining and end of string
 	str1 = str1.strip(' \t\n\r')
@@ -488,12 +583,12 @@ def convertSpecialChar(str1):
 		 "+": "_plus_","*":"_star_","/":"_slash_","(":"_bo_",")":"_bc_",
 		 "[":"_sbo_","]":"_sbc_",".":"_dot_"," ":"_"
 		}
-	for i,j in d.iteritems():
+	for i,j in d.items():
 		str1 = str1.replace(i,j)
 	return str1
 	
-def writeSpecies(modelpath,cremodel_,sbmlDoc):
-	#getting all the species 
+def writeSpecies(modelpath,cremodel_,sbmlDoc,sceneitems,autoCoordinateslayout):
+	#getting all the species
 	for spe in wildcardFind(modelpath+'/##[ISA=PoolBase]'):
 		sName = convertSpecialChar(spe.name)
 		comptVec = findCompartment(spe)
@@ -519,7 +614,8 @@ def writeSpecies(modelpath,cremodel_,sbmlDoc):
 
 			
 			s1.setName(sName)
-			s1.setInitialAmount(spe.nInit)
+			#s1.setInitialAmount(spe.nInit)
+			s1.setInitialConcentration(spe.concInit)
 			s1.setCompartment(compt)
 			#  Setting BoundaryCondition and constant as per this rule for BufPool
 			#  -constanst  -boundaryCondition  -has assignment/rate Rule  -can be part of sub/prd
@@ -547,25 +643,36 @@ def writeSpecies(modelpath,cremodel_,sbmlDoc):
 				s1.setBoundaryCondition(False)
 				s1.setConstant(False)
 			s1.setUnits("substance")
-			s1.setHasOnlySubstanceUnits( True )
+			s1.setHasOnlySubstanceUnits( False )
 			if moose.exists(spe.path+'/info'):
 				Anno = moose.Annotator(spe.path+'/info')
-				notesS = Anno.notes
-				if notesS != "":
-					cleanNotesS= convertNotesSpecialChar(notesS)
+				if Anno.notes != "":
+					cleanNotesS= convertNotesSpecialChar(Anno.notes)
 					notesStringS = "<body xmlns=\"http://www.w3.org/1999/xhtml\">\n \t \t"+ cleanNotesS + "\n\t </body>"
 					s1.setNotes(notesStringS)
-			#FindGroupName
-			element = moose.element(spe)
-			ele = getGroupinfo(element)
-			if ele.className == "Neutral":
-				speciGpname = "<moose:Group>"+ ele.name + "</moose:Group>\n"
-				speciannoexist = True
-			if speciannoexist :
-				speciAnno = "<moose:ModelAnnotation>\n"
-				if speciGpname:
-					speciAnno = speciAnno + speciGpname
-				speciAnno = speciAnno+ "</moose:ModelAnnotation>"
+				
+				element = moose.element(spe)
+				ele = getGroupinfo(element)
+				if ele.className == "Neutral" or sceneitems[element] or Anno.color or Anno.textColor:
+					speciannoexist = True
+				if speciannoexist :
+					speciAnno = "<moose:ModelAnnotation>\n"
+					if ele.className == "Neutral":
+						speciAnno = speciAnno + "<moose:Group>"+ ele.name + "</moose:Group>\n"
+					if sceneitems[element]:
+						v = sceneitems[element]
+						if autoCoordinateslayout == False:
+							speciAnno = speciAnno+"<moose:xCord>"+str(v['x'])+"</moose:xCord>\n"+"<moose:yCord>"+str(v['y'])+"</moose:yCord>\n"
+						elif autoCoordinateslayout == True:
+							x = calPrime(v['x'])
+							y = calPrime(v['y'])
+							speciAnno = speciAnno+"<moose:xCord>"+str(x)+"</moose:xCord>\n"+"<moose:yCord>"+str(y)+"</moose:yCord>\n"
+					if Anno.color:
+						speciAnno = speciAnno+"<moose:bgColor>"+Anno.color+"</moose:bgColor>\n"
+					if Anno.textColor:
+						speciAnno = speciAnno+"<moose:textColor>"+Anno.textColor+"</moose:textColor>\n"
+					speciAnno = speciAnno+ "</moose:ModelAnnotation>"
+					s1.setAnnotation(speciAnno)
 	return True
 
 def writeCompt(modelpath,cremodel_):
@@ -586,13 +693,14 @@ def writeCompt(modelpath,cremodel_):
 #write Simulation runtime,simdt,plotdt 
 def writeSimulationAnnotation(modelpath):
 	modelAnno = ""
+	plots = ""
 	if moose.exists(modelpath+'/info'):
 		mooseclock = moose.Clock('/clock')
 		modelAnno ="<moose:ModelAnnotation>\n"
 		modelAnnotation = moose.element(modelpath+'/info')
-		modelAnno = modelAnno+"<moose:ModelTime> "+str(modelAnnotation.runtime)+" </moose:ModelTime>\n"
-		modelAnno = modelAnno+"<moose:ModelSolver> "+modelAnnotation.solver+" </moose:ModelSolver>\n"
-		modelAnno = modelAnno+"<moose:simdt>"+ str(mooseclock.dts[11]) + " </moose:simdt>\n";
+		modelAnno = modelAnno+"<moose:runTime> "+str(modelAnnotation.runtime)+" </moose:runTime>\n"
+		modelAnno = modelAnno+"<moose:solver> "+modelAnnotation.solver+" </moose:solver>\n"
+		modelAnno = modelAnno+"<moose:simdt>"+ str(mooseclock.dts[12]) + " </moose:simdt>\n";
 		modelAnno = modelAnno+"<moose:plotdt> " + str(mooseclock.dts[18]) +" </moose:plotdt>\n";
 		plots = "";
 		graphs = moose.wildcardFind(modelpath+"/##[TYPE=Table2]")
@@ -601,15 +709,19 @@ def writeSimulationAnnotation(modelpath):
 			if len(gpath) != 0:
 				q = moose.element(gpath[0])
 				ori = q.path
+				name = convertSpecialChar(q.name)
 				graphSpefound = False
 				while not(isinstance(moose.element(q),moose.CubeMesh)):
 					q = q.parent
 					graphSpefound = True
 				if graphSpefound:
 					if not plots:
-						plots = ori[ori.find(q.name)-1:len(ori)]
+						#plots = ori[ori.find(q.name)-1:len(ori)]
+						plots = '/'+q.name+'/'+name
+
 					else:
-						plots = plots + "; "+ori[ori.find(q.name)-1:len(ori)]
+						#plots = plots + "; "+ori[ori.find(q.name)-1:len(ori)]
+						plots = plots + "; /"+q.name+'/'+name
 		if plots != " ":
 			modelAnno = modelAnno+ "<moose:plots> "+ plots+ "</moose:plots>\n";
 		modelAnno = modelAnno+"</moose:ModelAnnotation>"
@@ -627,16 +739,16 @@ def writeUnits(cremodel_):
 	unitSub = cremodel_.createUnitDefinition()
 	unitSub.setId("substance")
 	unit = unitSub.createUnit()
-	unit.setKind( UNIT_KIND_ITEM )
+	unit.setKind( UNIT_KIND_MOLE )
 	unit.setMultiplier(1)
 	unit.setExponent(1.0)
-	unit.setScale(0)
+	unit.setScale(-3)
 	
 
 def validateModel( sbmlDoc ):
 	#print " sbmlDoc ",sbmlDoc.toSBML()
 	if ( not sbmlDoc ):
-		print "validateModel: given a null SBML Document"
+		print("validateModel: given a null SBML Document")
 		return False
 	consistencyMessages    = ""
 	validationMessages     = ""
@@ -683,35 +795,216 @@ def validateModel( sbmlDoc ):
 	if ( noProblems ):
 		return True
 	else:
+		if consistencyMessages == None:
+			consistencyMessages = ""
 		if consistencyMessages != "":
-			print " consistency Warning: "+consistencyMessages
+			print(" consistency Warning: "+consistencyMessages)
 		
 		if ( numConsistencyErrors > 0 ):
 			if numConsistencyErrors == 1: t = "" 
 			else: t="s"
-			print "ERROR: encountered " + numConsistencyErrors + " consistency error" +t+ " in model '" + sbmlDoc.getModel().getId() + "'."
+			print("ERROR: encountered " + str(numConsistencyErrors) + " consistency error" + t + " in model '" + sbmlDoc.getModel().getId() + "'.")
 	if ( numConsistencyWarnings > 0 ):
 		if numConsistencyWarnings == 1:
 			t1 = "" 
 		else: t1 ="s"
-		print "Notice: encountered " + numConsistencyWarnings +" consistency warning" + t + " in model '" + sbmlDoc.getModel().getId() + "'."
+		print("Notice: encountered " + str(numConsistencyWarnings) + " consistency warning" + t1 + " in model '" + sbmlDoc.getModel().getId() + "'.")
 	  	
 	if ( numValidationErrors > 0 ):
 		if numValidationErrors == 1:
 			t2 = "" 
 		else: t2 ="s" 
-		print "ERROR: encountered " + numValidationErrors  + " validation error" + t2 + " in model '" + sbmlDoc.getModel().getId() + "'."
+		print("ERROR: encountered " + str(numValidationErrors) + " validation error" + t2 + " in model '" + sbmlDoc.getModel().getId() + "'.")
 		if ( numValidationWarnings > 0 ):
 			if numValidationWarnings == 1:
 				t3 = "" 
 			else: t3 = "s"
 
-			print "Notice: encountered " + numValidationWarnings + " validation warning" + t3 + " in model '" + sbmlDoc.getModel().getId() + "'." 
+			print("Notice: encountered " + str(numValidationWarnings) + " validation warning" + t3 + " in model '" + sbmlDoc.getModel().getId() + "'.") 
 		
-		print validationMessages;
+		print(validationMessages);
 	return ( numConsistencyErrors == 0 and numValidationErrors == 0)
 	#return ( numConsistencyErrors == 0 and numValidationErrors == 0, consistencyMessages)
 
+def setupItem(modelPath):
+    '''Collect information on what is connected to what, e.g. substrate and
+    product connectivity of reactions and enzymes, and sumtotal (function)
+    connectivity to pools.'''
+    #print " setupItem"
+    sublist = []
+    prdlist = []
+    cntDict = {}
+    zombieType = ['ReacBase','EnzBase','Function','StimulusTable']
+    for baseObj in zombieType:
+        path = '/##[ISA='+baseObj+']'
+        if modelPath != '/':
+            path = modelPath+path
+        if ( (baseObj == 'ReacBase') or (baseObj == 'EnzBase')):
+            for items in wildcardFind(path):
+                sublist = []
+                prdlist = []
+                uniqItem,countuniqItem = countitems(items,'subOut')
+                subNo = uniqItem
+                for sub in uniqItem: 
+                    sublist.append((element(sub),'s',countuniqItem[sub]))
+
+                uniqItem,countuniqItem = countitems(items,'prd')
+                prdNo = uniqItem
+                if (len(subNo) == 0 or len(prdNo) == 0):
+                    print("Substrate or product list is empty for ",items)
+                for prd in uniqItem:
+                    prdlist.append((element(prd),'p',countuniqItem[prd]))
+                
+                if (baseObj == 'CplxEnzBase') :
+                    uniqItem,countuniqItem = countitems(items,'toEnz')
+                    for enzpar in uniqItem:
+                        sublist.append((element(enzpar),'t',countuniqItem[enzpar]))
+                    
+                    uniqItem,countuniqItem = countitems(items,'cplxDest')
+                    for cplx in uniqItem:
+                        prdlist.append((element(cplx),'cplx',countuniqItem[cplx]))
+
+                if (baseObj == 'EnzBase'):
+                    uniqItem,countuniqItem = countitems(items,'enzDest')
+                    for enzpar in uniqItem:
+                        sublist.append((element(enzpar),'t',countuniqItem[enzpar]))
+                cntDict[items] = sublist,prdlist
+        elif baseObj == 'Function':
+            for items in wildcardFind(path):
+                sublist = []
+                prdlist = []
+                item = items.path+'/x[0]'
+                uniqItem,countuniqItem = countitems(item,'input')
+                for funcpar in uniqItem:
+                    sublist.append((element(funcpar),'sts',countuniqItem[funcpar]))
+                
+                uniqItem,countuniqItem = countitems(items,'valueOut')
+                for funcpar in uniqItem:
+                    prdlist.append((element(funcpar),'stp',countuniqItem[funcpar]))
+                cntDict[items] = sublist,prdlist
+        else:
+            for tab in wildcardFind(path):
+                tablist = []
+                uniqItem,countuniqItem = countitems(tab,'output')
+                for tabconnect in uniqItem:
+                    tablist.append((element(tabconnect),'tab',countuniqItem[tabconnect]))
+                cntDict[tab] = tablist
+    return cntDict
+
+def countitems(mitems,objtype):
+    items = []
+    #print "mitems in countitems ",mitems,objtype
+    items = element(mitems).neighbors[objtype]
+    uniqItems = set(items)
+    countuniqItems = Counter(items)
+    return(uniqItems,countuniqItems)
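+# For example (hypothetical pool names): calling countitems on a reaction whose
+# 'subOut' neighbors are [A, A, B] returns ({A, B}, Counter({A: 2, B: 1})),
+# i.e. the set gives the unique substrates and the Counter their stoichiometries.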
+
+def setupMeshObj(modelRoot):
+    ''' Setup compartments and their members (pool, reaction, enzyme, cplx) in the meshEntry dictionary.
+    meshEntry is keyed by compartment; each value maps a moose object type to
+    the list of objects of that particular type,
+    e.g. meshEntry[meshEnt] = { 'reaction': reaction_list,'enzyme':enzyme_list,'pool':poollist,'cplx': cplxlist }
+    '''
+    meshEntry = {}
+    meshEntryWildcard = '/##[ISA=ChemCompt]'
+    if modelRoot != '/':
+        meshEntryWildcard = modelRoot+meshEntryWildcard
+    for meshEnt in wildcardFind(meshEntryWildcard):
+        mollist = []
+        cplxlist = []
+        mol_cpl  = wildcardFind(meshEnt.path+'/##[ISA=PoolBase]')
+        funclist = wildcardFind(meshEnt.path+'/##[ISA=Function]')
+        enzlist  = wildcardFind(meshEnt.path+'/##[ISA=EnzBase]')
+        realist  = wildcardFind(meshEnt.path+'/##[ISA=ReacBase]')
+        tablist  = wildcardFind(meshEnt.path+'/##[ISA=StimulusTable]')
+        if mol_cpl or funclist or enzlist or realist or tablist:
+            for m in mol_cpl:
+                if isinstance(element(m.parent),CplxEnzBase):
+                    cplxlist.append(m)
+                elif isinstance(element(m),PoolBase):
+                    mollist.append(m)
+                    
+            meshEntry[meshEnt] = {'enzyme':enzlist,
+                                  'reaction':realist,
+                                  'pool':mollist,
+                                  'cplx':cplxlist,
+                                  'table':tablist,
+                                  'function':funclist
+                                  }
+    return(meshEntry)
+
+def autoCoordinates(meshEntry,srcdesConnection):
+    G = nx.Graph()
+    for cmpt,memb in list(meshEntry.items()):
+        for enzObj in find_index(memb,'enzyme'):
+            G.add_node(enzObj.path,label='',shape='ellipse',color='',style='filled',fontname='Helvetica',fontsize=12,fontcolor='blue')
+    for cmpt,memb in list(meshEntry.items()):
+        for poolObj in find_index(memb,'pool'):
+            #poolinfo = moose.element(poolObj.path+'/info')
+            G.add_node(poolObj.path,label = poolObj.name,shape = 'box',color = '',style = 'filled',fontname = 'Helvetica',fontsize = 12,fontcolor = 'blue')
+        for cplxObj in find_index(memb,'cplx'):
+            pass
+        for reaObj in find_index(memb,'reaction'):
+            G.add_node(reaObj.path,label='',shape='record',color='')
+        
+    for inn,out in list(srcdesConnection.items()):
+        if (inn.className =='ZombieReac'): arrowcolor = 'green'
+        elif(inn.className =='ZombieEnz'): arrowcolor = 'red'
+        else: arrowcolor = 'blue'
+        if isinstance(out,tuple):
+            if len(out[0])== 0:
+                print(inn.className + ':' +inn.name + "  doesn't have input message")
+            else:
+                for items in out[0]:
+                    G.add_edge(element(items[0]).path,inn.path)
+            if len(out[1]) == 0:
+                print(inn.className + ':' + inn.name + " doesn't have output message")
+            else:
+                for items in out[1]:
+                    G.add_edge(inn.path,element(items[0]).path)
+        elif isinstance(out,list):
+            if len(out) == 0:
+                print("Func pool doesn't have sumtotal")
+            else:
+                for items in out:
+                    G.add_edge(element(items[0]).path,inn.path)
+    #from networkx.drawing.nx_agraph import graphviz_layout
+    #position = graphviz_layout(G,prog='dot')
+
+    # networkx spring_layout is used instead of the graphviz 'dot' layout
+    position = nx.spring_layout(G)
+    #agraph = nx.to_agraph(G)
+    #agraph.draw("~/out.png", format = 'png', prog = 'dot')
+
+    sceneitems = {}
+    xycord = []
+    cmin = 0
+    cmax = 0
+    for key,value in list(position.items()):
+        xycord.append(value[0])
+        xycord.append(value[1])
+        sceneitems[element(key)] = {'x':value[0],'y':value[1]}
+    if len(xycord) > 0:
+        cmin = min(xycord)
+        cmax = max(xycord)
+    return cmin,cmax,sceneitems
+
+def calPrime(x):
+    prime = int((20*(float(x-cmin)/float(cmax-cmin)))-10)
+    return prime
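+# Illustrative check of the rescaling above (assumed values, not from a model):
+# with cmin = -5 and cmax = 5, calPrime(-5) -> -10, calPrime(0) -> 0, calPrime(5) -> 10,
+# i.e. layout coordinates are linearly mapped into a fixed [-10, 10] range.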
+
+def find_index(value, key):
+    """ Value.get(key) to avoid expection which would raise if empty value in dictionary for a given key """
+    if value.get(key) != None:
+        return value.get(key)
+    else:
+        raise ValueError('no dict with the key found')
 if __name__ == "__main__":
 
 	filepath = sys.argv[1]
@@ -726,9 +1019,9 @@ if __name__ == "__main__":
 	
 	moose.loadModel(filepath,loadpath,"gsl")
 	
-	written = mooseWriteSBML(loadpath,filepath)
+	written,c,writtentofile = mooseWriteSBML(loadpath,filepath)
 	if written:
-		print " File written to ",written
+		print(" File written to ",writtentofile)
 	else:
-		print " could not write model to SBML file"
+		print(" could not write model to SBML file")
 	
\ No newline at end of file
diff --git a/python/moose/genesis/_main.py b/python/moose/genesis/_main.py
index 67a15b98..38c0492d 100644
--- a/python/moose/genesis/_main.py
+++ b/python/moose/genesis/_main.py
@@ -14,6 +14,10 @@ import random
 from moose import wildcardFind, element, loadModel, ChemCompt, exists, Annotator, Pool, ZombiePool,PoolBase,CplxEnzBase,Function,ZombieFunction
 import numpy as np
 import re
+from collections import Counter
+import networkx as nx
+from PyQt4.QtGui import QColor
+import matplotlib 
 
 GENESIS_COLOR_SEQUENCE = ((248, 0, 255), (240, 0, 255), (232, 0, 255), (224, 0, 255), (216, 0, 255), (208, 0, 255),
  (200, 0, 255), (192, 0, 255), (184, 0, 255), (176, 0, 255), (168, 0, 255), (160, 0, 255), (152, 0, 255), (144, 0, 255),
@@ -33,565 +37,710 @@ GENESIS_COLOR_SEQUENCE = ((248, 0, 255), (240, 0, 255), (232, 0, 255), (224, 0,
  (255, 56, 0), (255, 48, 0), (255, 40, 0), (255, 32, 0), (255, 24, 0), (255, 16, 0), (255, 8, 0), (255, 0, 0))
 
 #Todo : To be written
-#               --Notes
 #               --StimulusTable
 
 def write( modelpath, filename,sceneitems=None):
-        if filename.rfind('.') != -1:
-            filename = filename[:filename.rfind('.')]
-        else:
-            filename = filename[:len(filename)]
-        filename = filename+'.g'
-        global NA
-        NA = 6.0221415e23
-        global xmin,xmax,ymin,ymax
-        global cord
-        global multi
-        xmin = ymin = 0
-        xmax = ymax = 1
-        multi = 50
-        cord = {}
-        compt = wildcardFind(modelpath+'/##[ISA=ChemCompt]')
-        maxVol = estimateDefaultVol(compt)
-        f = open(filename, 'w')
-        writeHeader (f,maxVol)
-        if (compt > 0):
-                if sceneitems == None:
-                        #if sceneitems is none (loaded from script) then check x,y cord exists
-                        xmin,ymin,xmax,ymax,positionInfoExist = getCor(modelpath,sceneitems)
-                        if not positionInfoExist:
-                                #incase of SBML or cspace or python Annotator is not populated then positionInfoExist= False
-                                #print " x and y cordinates doesn't exist so auto cordinates"
-                                print(" auto co-ordinates needs to be applied")
-                                pass
-                else:
-                        #This is when it comes from Gui where the objects are already layout on to scene
-                        # so using thoes co-ordinates
-                        xmin,ymin,xmax,ymax,positionInfoExist = getCor(modelpath,sceneitems)
-                gtId_vol = writeCompartment(modelpath,compt,f)
-                writePool(modelpath,f,gtId_vol)
-                reacList = writeReac(modelpath,f)
-                enzList = writeEnz(modelpath,f)
-                writeSumtotal(modelpath,f)
-                storeReacMsg(reacList,f)
-                storeEnzMsg(enzList,f)
-                writeGui(f)
-                tgraphs = wildcardFind(modelpath+'/##[ISA=Table2]')
-                if tgraphs:
-                        writeplot(tgraphs,f)
-                        storePlotMsgs(tgraphs,f)
-                writeFooter1(f)
-                writeNotes(modelpath,f)
-                writeFooter2(f)
-                return True
+    if filename.rfind('.') != -1:
+        filename = filename[:filename.rfind('.')]
+    else:
+        filename = filename[:len(filename)]
+    filename = filename+'.g'
+    global NA
+    NA = 6.0221415e23
+    global cmin,cmax,xmin,xmax,ymin,ymax
+    cmin = xmin = ymin = 0
+    cmax = xmax = ymax = 1
+
+    compt = wildcardFind(modelpath+'/##[ISA=ChemCompt]')
+    maxVol = estimateDefaultVol(compt)
+    f = open(filename, 'w')
+
+    if sceneitems is None:
+        srcdesConnection = {}
+        setupItem(modelpath,srcdesConnection)
+        meshEntry = setupMeshObj(modelpath)
+        cmin,cmax,sceneitems = autoCoordinates(meshEntry,srcdesConnection)
+        for k,v in sceneitems.items():
+            sceneitems[k]['x'] = calPrime(v['x'])
+            sceneitems[k]['y'] = calPrime(v['y'])
+    else:
+        cs, xcord, ycord = [], [], []
+        for k,v in sceneitems.items():
+            xcord.append(v['x'])
+            cs.append(v['x'])
+            ycord.append(v['y'])
+            cs.append(v['y'])
+        xmin = min(xcord)
+        xmax = max(xcord)
+        ymin = min(ycord)
+        ymax = max(ycord)
+
+        cmin = min(cs)
+        cmax = max(cs)
+    writeHeader (f,maxVol)
+    
+    if len(compt) > 0:
+        gtId_vol = writeCompartment(modelpath,compt,f)
+        writePool(modelpath,f,gtId_vol,sceneitems)
+        reacList = writeReac(modelpath,f,sceneitems)
+        enzList = writeEnz(modelpath,f,sceneitems)
+        writeSumtotal(modelpath,f)
+        f.write("simundump xgraph /graphs/conc1 0 0 99 0.001 0.999 0\n"
+                "simundump xgraph /graphs/conc2 0 0 100 0 1 0\n")
+        tgraphs = wildcardFind(modelpath+'/##[ISA=Table2]')
+        first, second = "", ""
+        if tgraphs:
+            first,second = writeplot(tgraphs,f)
+        if first:
+            f.write(first)
+        f.write("simundump xgraph /moregraphs/conc3 0 0 100 0 1 0\n"
+                "simundump xgraph /moregraphs/conc4 0 0 100 0 1 0\n")
+        if second:
+            f.write(second)
+        f.write("simundump xcoredraw /edit/draw 0 -6 4 -2 6\n"
+                "simundump xtree /edit/draw/tree 0 \\\n"
+                "  /kinetics/#[],/kinetics/#[]/#[],/kinetics/#[]/#[]/#[][TYPE!=proto],/kinetics/#[]/#[]/#[][TYPE!=linkinfo]/##[] \"edit_elm.D <v>; drag_from_edit.w <d> <S> <x> <y> <z>\" auto 0.6\n"
+                "simundump xtext /file/notes 0 1\n")
+        storeReacMsg(reacList,f)
+        storeEnzMsg(enzList,f)
+        if tgraphs:
+            storePlotMsgs(tgraphs,f)
+        writeFooter1(f)
+        writeNotes(modelpath,f)
+        writeFooter2(f)
+        return True
+    else:
+        print("Warning: writeKkit:: No model found on ", modelpath)
+        return False
+
+def calPrime(x):
+    prime = int((20*(float(x-cmin)/float(cmax-cmin)))-10)
+    return prime
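`calPrime` rescales an auto-layout coordinate from the global `[cmin, cmax]` range into the roughly `[-10, 10]` window that the Genesis editor uses. A standalone sketch of the same mapping (the function name and explicit `cmin`/`cmax` parameters are mine; the module version reads globals):

```python
def cal_prime(x, cmin, cmax):
    # Linearly map x from [cmin, cmax] onto [-10, 10], truncating toward zero,
    # mirroring what calPrime does with the module-level cmin/cmax globals.
    return int((20 * (float(x - cmin) / float(cmax - cmin))) - 10)

print(cal_prime(0, 0, 100))     # left edge of the range -> -10
print(cal_prime(50, 0, 100))    # midpoint -> 0
print(cal_prime(100, 0, 100))   # right edge -> 10
```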
+
+def setupItem(modelPath,cntDict):
+    '''Collect information on what is connected to what,
+    e.g. substrate/product connectivity of reactions and enzymes,
+    and sumtotal connectivity of functions to their pools.'''
+    #print " setupItem"
+    sublist = []
+    prdlist = []
+    zombieType = ['ReacBase','EnzBase','Function','StimulusTable']
+    for baseObj in zombieType:
+        path = '/##[ISA='+baseObj+']'
+        if modelPath != '/':
+            path = modelPath+path
+        if ( (baseObj == 'ReacBase') or (baseObj == 'EnzBase')):
+            for items in wildcardFind(path):
+                sublist = []
+                prdlist = []
+                uniqItem,countuniqItem = countitems(items,'subOut')
+                subNo = uniqItem
+                for sub in uniqItem: 
+                    sublist.append((element(sub),'s',countuniqItem[sub]))
+
+                uniqItem,countuniqItem = countitems(items,'prd')
+                prdNo = uniqItem
+                if (len(subNo) == 0 or len(prdNo) == 0):
+                    print("Substrate or Product is empty ", path, " ", items)
+                    
+                for prd in uniqItem:
+                    prdlist.append((element(prd),'p',countuniqItem[prd]))
+                
+                if (baseObj == 'CplxEnzBase') :
+                    uniqItem,countuniqItem = countitems(items,'toEnz')
+                    for enzpar in uniqItem:
+                        sublist.append((element(enzpar),'t',countuniqItem[enzpar]))
+                    
+                    uniqItem,countuniqItem = countitems(items,'cplxDest')
+                    for cplx in uniqItem:
+                        prdlist.append((element(cplx),'cplx',countuniqItem[cplx]))
+
+                if (baseObj == 'EnzBase'):
+                    uniqItem,countuniqItem = countitems(items,'enzDest')
+                    for enzpar in uniqItem:
+                        sublist.append((element(enzpar),'t',countuniqItem[enzpar]))
+                cntDict[items] = sublist,prdlist
+        elif baseObj == 'Function':
+            for items in wildcardFind(path):
+                sublist = []
+                prdlist = []
+                item = items.path+'/x[0]'
+                uniqItem,countuniqItem = countitems(item,'input')
+                for funcpar in uniqItem:
+                    sublist.append((element(funcpar),'sts',countuniqItem[funcpar]))
+                
+                uniqItem,countuniqItem = countitems(items,'valueOut')
+                for funcpar in uniqItem:
+                    prdlist.append((element(funcpar),'stp',countuniqItem[funcpar]))
+                cntDict[items] = sublist,prdlist
         else:
-                print("Warning: writeKkit:: No model found on " , modelpath)
-                return False
+            for tab in wildcardFind(path):
+                tablist = []
+                uniqItem,countuniqItem = countitems(tab,'output')
+                for tabconnect in uniqItem:
+                    tablist.append((element(tabconnect),'tab',countuniqItem[tabconnect]))
+                cntDict[tab] = tablist
+
+def countitems(mitems,objtype):
+    items = []
+    #print "mitems in countitems ",mitems,objtype
+    items = element(mitems).neighbors[objtype]
+    uniqItems = set(items)
+    countuniqItems = Counter(items)
+    return(uniqItems,countuniqItems)
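`countitems` deduplicates a neighbor list with `set` while keeping per-target message counts with `collections.Counter`. The same idiom on a plain list (the list here is illustrative, standing in for `element(mitems).neighbors[objtype]`):

```python
from collections import Counter

def count_items(items):
    # Unique targets plus how many messages point at each --
    # the (uniqItems, countuniqItems) pair countitems returns.
    return set(items), Counter(items)

uniq, counts = count_items(['poolA', 'poolA', 'poolB'])
print(sorted(uniq))      # -> ['poolA', 'poolB']
print(counts['poolA'])   # -> 2
```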
+
+def setupMeshObj(modelRoot):
+    '''Set up each compartment and its members (pool, reaction, enzyme, cplx)
+    in the meshEntry dictionary: the key is the compartment, and the value maps
+    a moose object type to the list of objects of that type, e.g.
+    meshEntry[meshEnt] = {'reaction': reaction_list, 'enzyme': enzyme_list,
+                         'pool': poollist, 'cplx': cplxlist}
+    '''
+    meshEntry = {}
+    meshEntryWildcard = '/##[ISA=ChemCompt]'
+    if modelRoot != '/':
+        meshEntryWildcard = modelRoot+meshEntryWildcard
+    for meshEnt in wildcardFind(meshEntryWildcard):
+        mollist = []
+        cplxlist = []
+        mol_cpl  = wildcardFind(meshEnt.path+'/##[ISA=PoolBase]')
+        funclist = wildcardFind(meshEnt.path+'/##[ISA=Function]')
+        enzlist  = wildcardFind(meshEnt.path+'/##[ISA=EnzBase]')
+        realist  = wildcardFind(meshEnt.path+'/##[ISA=ReacBase]')
+        tablist  = wildcardFind(meshEnt.path+'/##[ISA=StimulusTable]')
+        if mol_cpl or funclist or enzlist or realist or tablist:
+            for m in mol_cpl:
+                if isinstance(element(m.parent),CplxEnzBase):
+                    cplxlist.append(m)
+                elif isinstance(element(m),PoolBase):
+                    mollist.append(m)
+                    
+            meshEntry[meshEnt] = {'enzyme':enzlist,
+                                  'reaction':realist,
+                                  'pool':mollist,
+                                  'cplx':cplxlist,
+                                  'table':tablist,
+                                  'function':funclist
+                                  }
+    return(meshEntry)
+
+def autoCoordinates(meshEntry,srcdesConnection):
+    G = nx.Graph()
+    for cmpt,memb in meshEntry.items():
+        for enzObj in find_index(memb,'enzyme'):
+            #G.add_node(enzObj.path)
+            G.add_node(enzObj.path,label=enzObj.name,shape='ellipse',color='',style='filled',fontname='Helvetica',fontsize=12,fontcolor='blue')
+    for cmpt,memb in meshEntry.items():
+        for poolObj in find_index(memb,'pool'):
+            #G.add_node(poolObj.path)
+            G.add_node(poolObj.path,label = poolObj.name,shape = 'box',color = '',style = 'filled',fontname = 'Helvetica',fontsize = 12,fontcolor = 'blue')
+        for cplxObj in find_index(memb,'cplx'):
+            pass
+            #G.add_node(cplxObj.path)
+            #G.add_edge((cplxObj.parent).path,cplxObj.path)
+        for reaObj in find_index(memb,'reaction'):
+            #G.add_node(reaObj.path)
+            G.add_node(reaObj.path,label=reaObj.name,shape='record',color='')
+        for funcObj in find_index(memb,'function'):
+            G.add_node(funcObj.path,label = funcObj.name,shape = 'box',color = 'red',style = 'filled',fontname = 'Helvetica',fontsize = 12,fontcolor = 'blue')
+
+        
+    for inn,out in srcdesConnection.items():
+        if (inn.className =='ZombieReac'): arrowcolor = 'green'
+        elif(inn.className =='ZombieEnz'): arrowcolor = 'red'
+        else: arrowcolor = 'blue'
+        if isinstance(out,tuple):
+            if len(out[0])== 0:
+                print(inn.className + ':' + inn.name + " doesn't have an input message")
+            else:
+                for items in (items for items in out[0] ):
+                    G.add_edge(element(items[0]).path,inn.path)
+            if len(out[1]) == 0:
+                print(inn.className + ':' + inn.name + " doesn't have an output message")
+            else:
+                for items in (items for items in out[1] ):
+                    G.add_edge(inn.path,element(items[0]).path)
+        elif isinstance(out,list):
+            if len(out) == 0:
+                print("Function pool doesn't have a sumtotal")
+            else:
+                for items in (items for items in out ):
+                    G.add_edge(element(items[0]).path,inn.path)
+    
+    position = nx.graphviz_layout(G, prog = 'dot')
+    #agraph = nx.to_agraph(G)
+    #agraph.draw("writetogenesis.png", format = 'png', prog = 'dot')
+    sceneitems = {}
+    xycord = []
+
+    for key,value in position.items():
+        xycord.append(value[0])
+        xycord.append(value[1])
+        sceneitems[element(key)] = {'x':value[0],'y':value[1]}
+    cmin = min(xycord)
+    cmax = max(xycord)
+    return cmin,cmax,sceneitems
+
+def find_index(value, key):
+    """Use value.get(key) to avoid the exception that would be raised if the key is missing from the dictionary."""
+    if value.get(key) is not None:
+        return value.get(key)
+    else:
+        raise ValueError('no dict with the key found')
 
 def storeCplxEnzMsgs( enz, f ):
-        for sub in enz.neighbors["subOut"]:
-                s = "addmsg /kinetics/" + trimPath( sub ) + " /kinetics/" + trimPath(enz) + " SUBSTRATE n \n";
-                s = s+ "addmsg /kinetics/" + trimPath( enz ) + " /kinetics/" + trimPath( sub ) +        " REAC sA B \n";
-                f.write(s)
-        for prd in enz.neighbors["prd"]:
-                s = "addmsg /kinetics/" + trimPath( enz ) + " /kinetics/" + trimPath(prd) + " MM_PRD pA\n";
-                f.write( s )
-        for enzOut in enz.neighbors["enzOut"]:
-                s = "addmsg /kinetics/" + trimPath( enzOut ) + " /kinetics/" + trimPath(enz) + " ENZYME n\n";
-                s = s+ "addmsg /kinetics/" + trimPath( enz ) + " /kinetics/" + trimPath(enzOut) + " REAC eA B\n";
-                f.write( s )
+    for sub in enz.neighbors["subOut"]:
+        s = "addmsg /kinetics/" + trimPath( sub ) + " /kinetics/" + trimPath(enz) + " SUBSTRATE n \n";
+        s = s+ "addmsg /kinetics/" + trimPath( enz ) + " /kinetics/" + trimPath( sub ) +        " REAC sA B \n";
+        f.write(s)
+    for prd in enz.neighbors["prd"]:
+        s = "addmsg /kinetics/" + trimPath( enz ) + " /kinetics/" + trimPath(prd) + " MM_PRD pA\n";
+        f.write( s )
+    for enzOut in enz.neighbors["enzOut"]:
+        s = "addmsg /kinetics/" + trimPath( enzOut ) + " /kinetics/" + trimPath(enz) + " ENZYME n\n";
+        s = s+ "addmsg /kinetics/" + trimPath( enz ) + " /kinetics/" + trimPath(enzOut) + " REAC eA B\n";
+        f.write( s )
 
 def storeMMenzMsgs( enz, f):
-        subList = enz.neighbors["subOut"]
-        prdList = enz.neighbors["prd"]
-        enzDestList = enz.neighbors["enzDest"]
-        for esub in subList:
-                es = "addmsg /kinetics/" + trimPath(element(esub)) + " /kinetics/" + trimPath(enz) + " SUBSTRATE n \n";
-                es = es+"addmsg /kinetics/" + trimPath(enz) + " /kinetics/" + trimPath(element(esub)) + " REAC sA B \n";
-                f.write(es)
-
-        for eprd in prdList:
-                es = "addmsg /kinetics/" + trimPath( enz ) + " /kinetics/" + trimPath( element(eprd)) + " MM_PRD pA \n";
-                f.write(es)
-        for eenzDest in enzDestList:
-                enzDest = "addmsg /kinetics/" + trimPath( element(eenzDest)) + " /kinetics/" + trimPath( enz ) + " ENZYME n \n";
-                f.write(enzDest)
+    subList = enz.neighbors["subOut"]
+    prdList = enz.neighbors["prd"]
+    enzDestList = enz.neighbors["enzDest"]
+    for esub in subList:
+        es = "addmsg /kinetics/" + trimPath(element(esub)) + " /kinetics/" + trimPath(enz) + " SUBSTRATE n \n";
+        es = es+"addmsg /kinetics/" + trimPath(enz) + " /kinetics/" + trimPath(element(esub)) + " REAC sA B \n";
+        f.write(es)
+
+    for eprd in prdList:
+        es = "addmsg /kinetics/" + trimPath( enz ) + " /kinetics/" + trimPath( element(eprd)) + " MM_PRD pA \n";
+        f.write(es)
+    for eenzDest in enzDestList:
+        enzDest = "addmsg /kinetics/" + trimPath( element(eenzDest)) + " /kinetics/" + trimPath( enz ) + " ENZYME n \n";
+        f.write(enzDest)
 
 def storeEnzMsg( enzList, f):
-        for enz in enzList:
-                enzClass = enz.className
-                if (enzClass == "ZombieMMenz" or enzClass == "MMenz"):
-                        storeMMenzMsgs(enz, f)
-                else:
-                        storeCplxEnzMsgs( enz, f )
-
-def writeEnz( modelpath,f):
-        enzList = wildcardFind(modelpath+'/##[ISA=EnzBase]')
-        for enz in enzList:
-                x = random.randrange(0,10)
-                y = random.randrange(0,10)
-                textcolor = "green"
-                color = "red"
-                k1 = 0;
-                k2 = 0;
-                k3 = 0;
-                nInit = 0;
-                concInit = 0;
-                n = 0;
-                conc = 0;
-                enzParent = enz.parent
-                if (isinstance(enzParent.className,Pool)) or (isinstance(enzParent.className,ZombiePool)):
-                        print(" raise exception enz doesn't have pool as parent")
-                        return False
-                else:
-                        vol = enzParent.volume * NA * 1e-3;
-                        isMichaelisMenten = 0;
-                        enzClass = enz.className
-                        if (enzClass == "ZombieMMenz" or enzClass == "MMenz"):
-                                k1 = enz.numKm
-                                k3 = enz.kcat
-                                k2 = 4.0*k3;
-                                k1 = (k2 + k3) / k1;
-                                isMichaelisMenten = 1;
-
-                        elif (enzClass == "ZombieEnz" or enzClass == "Enz"):
-                                k1 = enz.k1
-                                k2 = enz.k2
-                                k3 = enz.k3
-                                cplx = enz.neighbors['cplx'][0]
-                                nInit = cplx.nInit[0];
-
-                        xe = cord[enz]['x']
-                        ye = cord[enz]['y']
-                        x = ((xe-xmin)/(xmax-xmin))*multi
-                        y = ((ye-ymin)/(ymax-ymin))*multi
-                        #y = ((ymax-ye)/(ymax-ymin))*multi
-                        einfo = enz.path+'/info'
-                        if exists(einfo):
-                                color = Annotator(einfo).getField('color')
-                                color = getColorCheck(color,GENESIS_COLOR_SEQUENCE)
-
-                                textcolor = Annotator(einfo).getField('textColor')
-                                textcolor = getColorCheck(textcolor,GENESIS_COLOR_SEQUENCE)
-
-                        f.write("simundump kenz /kinetics/" + trimPath(enz) + " " + str(0)+  " " +
-                                        str(concInit) + " " +
-                                        str(conc) + " " +
-                                        str(nInit) + " " +
-                                        str(n) + " " +
-                                        str(vol) + " " +
-                                        str(k1) + " " +
-                                        str(k2) + " " +
-                                        str(k3) + " " +
-                                        str(0) + " " +
-                                        str(isMichaelisMenten) + " " +
-                                        "\"\"" + " " +
-                                        str(color) + " " + str(textcolor) + " \"\"" +
-                                        " " + str(int(x)) + " " + str(int(y)) + " "+str(0)+"\n")
-        return enzList
+    for enz in enzList:
+        enzClass = enz.className
+        if (enzClass == "ZombieMMenz" or enzClass == "MMenz"):
+            storeMMenzMsgs(enz, f)
+        else:
+            storeCplxEnzMsgs( enz, f )
+
+def writeEnz( modelpath,f,sceneitems):
+    enzList = wildcardFind(modelpath+'/##[ISA=EnzBase]')
+    for enz in enzList:
+        x = random.randrange(0,10)
+        y = random.randrange(0,10)
+        textcolor = ""
+        color = ""
+        k1 = 0;
+        k2 = 0;
+        k3 = 0;
+        nInit = 0;
+        concInit = 0;
+        n = 0;
+        conc = 0;
+        enzParent = enz.parent
+        # Enzymes are expected to sit under a pool; warn and bail out otherwise.
+        if not isinstance(element(enzParent), PoolBase):
+            print("Warning: writeKkit:: enzyme doesn't have a pool as parent: ", enz.path)
+            return False
+        else:
+            vol = enzParent.volume * NA * 1e-3;
+            isMichaelisMenten = 0;
+            enzClass = enz.className
+            if (enzClass == "ZombieMMenz" or enzClass == "MMenz"):
+                k1 = enz.numKm
+                k3 = enz.kcat
+                k2 = 4.0*k3;
+                k1 = (k2 + k3) / k1;
+                isMichaelisMenten = 1;
+
+            elif (enzClass == "ZombieEnz" or enzClass == "Enz"):
+                k1 = enz.k1
+                k2 = enz.k2
+                k3 = enz.k3
+                cplx = enz.neighbors['cplx'][0]
+                nInit = cplx.nInit[0];
+            if sceneitems is not None:
+                x = sceneitems[enz]['x']
+                y = sceneitems[enz]['y']
+
+            einfo = enz.path+'/info'
+            if exists(einfo):
+                color = Annotator(einfo).getField('color')
+                color = getColorCheck(color,GENESIS_COLOR_SEQUENCE)
+
+                textcolor = Annotator(einfo).getField('textColor')
+                textcolor = getColorCheck(textcolor,GENESIS_COLOR_SEQUENCE)
+
+            if color == "" or color == " ":
+                color = getRandColor()
+            if textcolor == ""  or textcolor == " ":
+                textcolor = getRandColor()
+
+            f.write("simundump kenz /kinetics/" + trimPath(enz) + " " + str(0)+  " " +
+                str(concInit) + " " +
+                str(conc) + " " +
+                str(nInit) + " " +
+                str(n) + " " +
+                str(vol) + " " +
+                str(k1) + " " +
+                str(k2) + " " +
+                str(k3) + " " +
+                str(0) + " " +
+                str(isMichaelisMenten) + " " +
+                "\"\"" + " " +
+                str(textcolor) + " " + str(color) + " \"\"" +
+                " " + str(int(x)) + " " + str(int(y)) + " "+str(0)+"\n")
+    return enzList
 
 def nearestColorIndex(color, color_sequence):
-        #Trying to find the index to closest color map from the rainbow pickle file for matching the Genesis color map
-        distance = [ (color[0] - temp[0]) ** 2 + (color[1] - temp[1]) ** 2 + (color[2] - temp[2]) ** 2
-                     for temp in color_sequence]
+    #Trying to find the index to closest color map from the rainbow pickle file for matching the Genesis color map
+    distance = [ (color[0] - temp[0]) ** 2 + (color[1] - temp[1]) ** 2 + (color[2] - temp[2]) ** 2
+                 for temp in color_sequence]
 
-        minindex = 0
+    minindex = 0
 
-        for i in range(1, len(distance)):
-                if distance[minindex] > distance[i] : minindex = i
+    for i in range(1, len(distance)):
+        if distance[minindex] > distance[i] : minindex = i
 
-        return minindex
+    return minindex
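`nearestColorIndex` maps an arbitrary RGB triple onto the Genesis palette by minimising squared Euclidean distance; the square root is skipped since it does not change the ordering. A self-contained sketch against a three-entry toy palette (the palette here is mine, not the Genesis one):

```python
def nearest_color_index(color, palette):
    # Squared RGB distance to every palette entry; index of the smallest wins.
    distance = [(color[0] - r) ** 2 + (color[1] - g) ** 2 + (color[2] - b) ** 2
                for (r, g, b) in palette]
    return distance.index(min(distance))

toy_palette = [(255, 0, 0), (0, 255, 0), (0, 0, 255)]
print(nearest_color_index((250, 10, 5), toy_palette))   # -> 0 (closest to red)
```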
 
 def storeReacMsg(reacList,f):
-        for reac in reacList:
-                reacPath = trimPath( reac);
-                sublist = reac.neighbors["subOut"]
-                prdlist = reac.neighbors["prd"]
-                for sub in sublist:
-                        s = "addmsg /kinetics/" + trimPath( sub ) + " /kinetics/" + reacPath +  " SUBSTRATE n \n";
-                        s =  s + "addmsg /kinetics/" + reacPath + " /kinetics/" + trimPath( sub ) +  " REAC A B \n";
-                        f.write(s)
-
-                for prd in prdlist:
-                        s = "addmsg /kinetics/" + trimPath( prd ) + " /kinetics/" + reacPath + " PRODUCT n \n";
-                        s = s + "addmsg /kinetics/" + reacPath + " /kinetics/" + trimPath( prd ) +  " REAC B A\n";
-                        f.write( s)
-
-def writeReac(modelpath,f):
-        reacList = wildcardFind(modelpath+'/##[ISA=ReacBase]')
-        for reac in reacList :
-                color = "blue"
-                textcolor = "red"
-                kf = reac.numKf
-                kb = reac.numKb
-                xr = cord[reac]['x']
-                yr = cord[reac]['y']
-                x = ((xr-xmin)/(xmax-xmin))*multi
-                y = ((yr-ymin)/(ymax-ymin))*multi
-                #y = ((ymax-yr)/(ymax-ymin))*multi
-                rinfo = reac.path+'/info'
-                if exists(rinfo):
-                        color = Annotator(rinfo).getField('color')
-                        color = getColorCheck(color,GENESIS_COLOR_SEQUENCE)
-
-                        textcolor = Annotator(rinfo).getField('textColor')
-                        textcolor = getColorCheck(textcolor,GENESIS_COLOR_SEQUENCE)
-
-                f.write("simundump kreac /kinetics/" + trimPath(reac) + " " +str(0) +" "+ str(kf) + " " + str(kb) + " \"\" " +
-                        str(color) + " " + str(textcolor) + " " + str(int(x)) + " " + str(int(y)) + " 0\n")
-        return reacList
-
+    for reac in reacList:
+        reacPath = trimPath( reac);
+        sublist = reac.neighbors["subOut"]
+        prdlist = reac.neighbors["prd"]
+        for sub in sublist:
+            s = "addmsg /kinetics/" + trimPath( sub ) + " /kinetics/" + reacPath +  " SUBSTRATE n \n";
+            s =  s + "addmsg /kinetics/" + reacPath + " /kinetics/" + trimPath( sub ) +  " REAC A B \n";
+            f.write(s)
+
+        for prd in prdlist:
+            s = "addmsg /kinetics/" + trimPath( prd ) + " /kinetics/" + reacPath + " PRODUCT n \n";
+            s = s + "addmsg /kinetics/" + reacPath + " /kinetics/" + trimPath( prd ) +  " REAC B A\n";
+            f.write( s)
+
+def writeReac(modelpath,f,sceneitems):
+    reacList = wildcardFind(modelpath+'/##[ISA=ReacBase]')
+    for reac in reacList :
+        color = ""
+        textcolor = ""
+        kf = reac.numKf
+        kb = reac.numKb
+        x = sceneitems[reac]['x']
+        y = sceneitems[reac]['y']
+        rinfo = reac.path+'/info'
+        if exists(rinfo):
+            color = Annotator(rinfo).getField('color')
+            color = getColorCheck(color,GENESIS_COLOR_SEQUENCE)
+
+            textcolor = Annotator(rinfo).getField('textColor')
+            textcolor = getColorCheck(textcolor,GENESIS_COLOR_SEQUENCE)
+        
+        if color == "" or color == " ":
+            color = getRandColor()
+        if textcolor == ""  or textcolor == " ":
+            textcolor = getRandColor()
+        f.write("simundump kreac /kinetics/" + trimPath(reac) + " " +str(0) +" "+ str(kf) + " " + str(kb) + " \"\" " +
+                str(color) + " " + str(textcolor) + " " + str(int(x)) + " " + str(int(y)) + " 0\n")
+    return reacList
+ 
 def trimPath(mobj):
-        original = mobj
-        mobj = element(mobj)
-        found = False
-        while not isinstance(mobj,ChemCompt) and mobj.path != "/":
-                mobj = element(mobj.parent)
-                found = True
-        if mobj.path == "/":
-                print("compartment is not found with the given path and the path has reached root ",original)
-                return
-        #other than the kinetics compartment, all the othername are converted to group in Genesis which are place under /kinetics
-        # Any moose object comes under /kinetics then one level down the path is taken.
-        # e.g /group/poolObject or /Reac
-        if found:
-                if mobj.name != "kinetics":
-                        splitpath = original.path[(original.path.find(mobj.name)):len(original.path)]
-                else:
-
-                        pos = original.path.find(mobj.name)
-                        slash = original.path.find('/',pos+1)
-                        splitpath = original.path[slash+1:len(original.path)]
-                splitpath = re.sub("\[[0-9]+\]", "", splitpath)
-                s = splitpath.replace("_dash_",'-')
-                return s
+    original = mobj
+    mobj = element(mobj)
+    found = False
+    while not isinstance(mobj,ChemCompt) and mobj.path != "/":
+        mobj = element(mobj.parent)
+        found = True
+    if mobj.path == "/":
+        print("compartment is not found with the given path and the path has reached root ",original)
+        return
+    # Other than the kinetics compartment, all other names are converted to
+    # groups in Genesis, which are placed under /kinetics. For any moose object
+    # under /kinetics, the path one level down is taken, e.g. /group/poolObject or /Reac.
+    if found:
+        if mobj.name != "kinetics" and (mobj.className != "CubeMesh"):
+            splitpath = original.path[(original.path.find(mobj.name)):len(original.path)]
+        else:
+            pos = original.path.find(mobj.name)
+            slash = original.path.find('/',pos+1)
+            splitpath = original.path[slash+1:len(original.path)]
+        splitpath = re.sub("\[[0-9]+\]", "", splitpath)
+        s = splitpath.replace("_dash_",'-')
+        return s
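The last two steps of `trimPath` are pure string clean-up: strip MOOSE array indices such as `[0]` and restore `-` characters that were encoded as `_dash_`. The same transformations in isolation, on a hypothetical path:

```python
import re

def clean_kkit_path(splitpath):
    # Drop array indices ("[0]", "[12]", ...) then undo the "_dash_" encoding,
    # as trimPath does just before returning.
    splitpath = re.sub(r"\[[0-9]+\]", "", splitpath)
    return splitpath.replace("_dash_", "-")

print(clean_kkit_path("group/Ca_dash_pool[0]"))   # -> group/Ca-pool
```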
 
 def writeSumtotal( modelpath,f):
-        funclist = wildcardFind(modelpath+'/##[ISA=Function]')
-        for func in funclist:
-                funcInputs = element(func.path+'/x[0]')
-                s = ""
-                for funcInput in funcInputs.neighbors["input"]:
-                        s = s+ "addmsg /kinetics/" + trimPath(funcInput)+ " /kinetics/" + trimPath(element(func.parent)) + " SUMTOTAL n nInit\n"
-                f.write(s)
-
-def storePlotMsgs( tgraphs,f):
+    funclist = wildcardFind(modelpath+'/##[ISA=Function]')
+    for func in funclist:
+        funcInputs = element(func.path+'/x[0]')
         s = ""
-        if tgraphs:
-                for graph in tgraphs:
-                        slash = graph.path.find('graphs')
-                        if not slash > -1:
-                                slash = graph.path.find('graph_0')
-                        if slash > -1:
-                                conc = graph.path.find('conc')
-                                if conc > -1 :
-                                        tabPath = graph.path[slash:len(graph.path)]
-                                else:
-                                        slash1 = graph.path.find('/',slash)
-                                        tabPath = "/graphs/conc1" +graph.path[slash1:len(graph.path)]
-
-                                if len(element(graph).msgOut):
-                                        poolPath = (element(graph).msgOut)[0].e2.path
-                                        poolEle = element(poolPath)
-                                        poolName = poolEle.name
-                                        bgPath = (poolEle.path+'/info')
-                                        bg = Annotator(bgPath).color
-                                        bg = getColorCheck(bg,GENESIS_COLOR_SEQUENCE)
-                                        tabPath = re.sub("\[[0-9]+\]", "", tabPath)
-                                        s = s+"addmsg /kinetics/" + trimPath( poolEle ) + " " + tabPath + \
-                                                " PLOT Co *" + poolName + " *" + str(bg) +"\n";
+        for funcInput in funcInputs.neighbors["input"]:
+            s = s+ "addmsg /kinetics/" + trimPath(funcInput)+ " /kinetics/" + trimPath(element(func.parent)) + " SUMTOTAL n nInit\n"
         f.write(s)
 
-def writeplot( tgraphs,f ):
-        if tgraphs:
-                for graphs in tgraphs:
-                        slash = graphs.path.find('graphs')
-                        if not slash > -1:
-                                slash = graphs.path.find('graph_0')
-                        if slash > -1:
-                                conc = graphs.path.find('conc')
-                                if conc > -1 :
-                                        tabPath = "/"+graphs.path[slash:len(graphs.path)]
-                                else:
-                                        slash1 = graphs.path.find('/',slash)
-                                        tabPath = "/graphs/conc1" +graphs.path[slash1:len(graphs.path)]
-
-                                if len(element(graphs).msgOut):
-                                        poolPath = (element(graphs).msgOut)[0].e2.path
-                                        poolEle = element(poolPath)
-                                        poolAnno = (poolEle.path+'/info')
-                                        fg = Annotator(poolAnno).textColor
-                                        fg = getColorCheck(fg,GENESIS_COLOR_SEQUENCE)
-                                        tabPath = re.sub("\[[0-9]+\]", "", tabPath)
-                                        f.write("simundump xplot " + tabPath + " 3 524288 \\\n" + "\"delete_plot.w <s> <d>; edit_plot.D <w>\" " + fg + " 0 0 1\n")
-
-def writePool(modelpath,f,volIndex):
-        for p in wildcardFind(modelpath+'/##[ISA=PoolBase]'):
-                slave_enable = 0
-                if (p.className == "BufPool" or p.className == "ZombieBufPool"):
-                        pool_children = p.children
-                        if pool_children== 0:
-                                slave_enable = 4
-                        else:
-                                for pchild in pool_children:
-                                        if not(pchild.className == "ZombieFunction") and not(pchild.className == "Function"):
-                                                slave_enable = 4
-                                        else:
-                                                slave_enable = 0
-                                                break
-                #Eliminated enzyme complex pool
-                if ((p.parent).className != "Enz" and (p.parent).className != "ZombieEnz"):
-                    xp = cord[p]['x']
-                    yp = cord[p]['y']
-                    x = ((xp-xmin)/(xmax-xmin))*multi
-                    y = ((yp-ymin)/(ymax-ymin))*multi
-                    #y = ((ymax-yp)/(ymax-ymin))*multi
-                    pinfo = p.path+'/info'
-                    if exists(pinfo):
-                            color = Annotator(pinfo).getField('color')
-                            color = getColorCheck(color,GENESIS_COLOR_SEQUENCE)
-
-                            textcolor = Annotator(pinfo).getField('textColor')
-                            textcolor = getColorCheck(textcolor,GENESIS_COLOR_SEQUENCE)
-                    geometryName = volIndex[p.volume]
-                    volume = p.volume * NA * 1e-3
-                    f.write("simundump kpool /kinetics/" + trimPath(p) + " 0 " +
-                            str(p.diffConst) + " " +
-                            str(0) + " " +
-                            str(0) + " " +
-                            str(0) + " " +
-                            str(p.nInit) + " " +
-                            str(0) + " " + str(0) + " " +
-                            str(volume)+ " " +
-                            str(slave_enable) +
-                            " /kinetics"+ geometryName + " " +
-                            str(color) +" " + str(textcolor) + " " + str(int(x)) + " " + str(int(y)) + " "+ str(0)+"\n")
-                # print " notes ",notes
-                # return notes
+def storePlotMsgs( tgraphs,f):
+    s = ""
+    if tgraphs:
+        for graph in tgraphs:
+            slash = graph.path.find('graphs')
+            if not slash > -1:
+                slash = graph.path.find('graph_0')
+            if slash > -1:
+                conc = graph.path.find('conc')
+                if conc > -1:
+                    tabPath = graph.path[slash:len(graph.path)]
+                else:
+                    slash1 = graph.path.find('/',slash)
+                    tabPath = "/graphs/conc1" +graph.path[slash1:len(graph.path)]
+
+                if len(element(graph).msgOut):
+                    poolPath = (element(graph).msgOut)[0].e2.path
+                    poolEle = element(poolPath)
+                    poolName = poolEle.name
+                    bgPath = (poolEle.path+'/info')
+                    bg = Annotator(bgPath).color
+                    bg = getColorCheck(bg,GENESIS_COLOR_SEQUENCE)
+                    tabPath = re.sub("\[[0-9]+\]", "", tabPath)
+                    s = s+"addmsg /kinetics/" + trimPath( poolEle ) + " " + tabPath + \
+                            " PLOT Co *" + poolName + " *" + str(bg) +"\n"
+    f.write(s)
 
-def getColorCheck(color,GENESIS_COLOR_SEQUENCE):
-        if isinstance(color, str):
-                if color.startswith("#"):
-                        color = ( int(color[1:3], 16)
-                                            , int(color[3:5], 16)
-                                            , int(color[5:7], 16)
-                                            )
-                        index = nearestColorIndex(color, GENESIS_COLOR_SEQUENCE)
-                        return index
-                elif color.startswith("("):
-                        color = eval(color)[0:3]
-                        index = nearestColorIndex(color, GENESIS_COLOR_SEQUENCE)
-                        return index
+def writeplot( tgraphs,f ):
+    first, second = " ", " "
+    if tgraphs:
+        for graphs in tgraphs:
+            slash = graphs.path.find('graphs')
+            if not slash > -1:
+                slash = graphs.path.find('graph_0')
+            if slash > -1:
+                conc = graphs.path.find('conc')
+                if conc > -1:
+                    tabPath = "/"+graphs.path[slash:len(graphs.path)]
                 else:
-                        index = color
-                        return index
-        elif isinstance(color, tuple):
-                color = map(int, color)[0:3]
-                index = nearestColorIndex(color, GENESIS_COLOR_SEQUENCE)
-                return index
-        elif isinstance(color, int):
-                index = color
-                return index
+                    slash1 = graphs.path.find('/',slash)
+                    tabPath = "/graphs/conc1" +graphs.path[slash1:len(graphs.path)]
+                if len(element(graphs).msgOut):
+                    poolPath = (element(graphs).msgOut)[0].e2.path
+                    poolEle = element(poolPath)
+                    poolAnno = (poolEle.path+'/info')
+                    fg = Annotator(poolAnno).textColor
+                    fg = getColorCheck(fg,GENESIS_COLOR_SEQUENCE)
+                    tabPath = re.sub("\[[0-9]+\]", "", tabPath)
+                    if tabPath.find("conc1") >= 0 or tabPath.find("conc2") >= 0:
+                        first = first + "simundump xplot " + tabPath + " 3 524288 \\\n" + "\"delete_plot.w <s> <d>; edit_plot.D <w>\" " + fg + " 0 0 1\n"
+                    if tabPath.find("conc3") >= 0 or tabPath.find("conc4") >= 0:
+                        second = second + "simundump xplot " + tabPath + " 3 524288 \\\n" + "\"delete_plot.w <s> <d>; edit_plot.D <w>\" " + fg + " 0 0 1\n"
+    return first,second
+
+def writePool(modelpath,f,volIndex,sceneitems):
+    color = ""
+    textcolor = ""
+    for p in wildcardFind(modelpath+'/##[ISA=PoolBase]'):
+        slave_enable = 0
+        if (p.className == "BufPool" or p.className == "ZombieBufPool"):
+            pool_children = p.children
+            if len(pool_children) == 0:
+                slave_enable = 4
+            else:
+                for pchild in pool_children:
+                    if not(pchild.className == "ZombieFunction") and not(pchild.className == "Function"):
+                        slave_enable = 4
+                    else:
+                        slave_enable = 0
+                        break
+        if (p.parent.className != "Enz" and p.parent.className != "ZombieEnz"):
+            # A pool whose parent is an enzyme is an enzyme-complex (cplx) pool, which is not written to genesis
+            x = sceneitems[p]['x']
+            y = sceneitems[p]['y']
+            # Reset per pool so a pool without an info child does not inherit the previous pool's colors
+            color = ""
+            textcolor = ""
+            pinfo = p.path+'/info'
+            if exists(pinfo):
+                color = Annotator(pinfo).getField('color')
+                color = getColorCheck(color,GENESIS_COLOR_SEQUENCE)
+                textcolor = Annotator(pinfo).getField('textColor')
+                textcolor = getColorCheck(textcolor,GENESIS_COLOR_SEQUENCE)
+
+            geometryName = volIndex[p.volume]
+            volume = p.volume * NA * 1e-3
+            if color == "" or color == " ":
+                color = getRandColor()
+            if textcolor == ""  or textcolor == " ":
+                textcolor = getRandColor()
+            f.write("simundump kpool /kinetics/" + trimPath(p) + " 0 " +
+                    str(p.diffConst) + " " +
+                    str(0) + " " +
+                    str(0) + " " +
+                    str(0) + " " +
+                    str(p.nInit) + " " +
+                    str(0) + " " + str(0) + " " +
+                    str(volume)+ " " +
+                    str(slave_enable) +
+                    " /kinetics"+ geometryName + " " +
+                    str(color) +" " + str(textcolor) + " " + str(int(x)) + " " + str(int(y)) + " "+ str(0)+"\n")
+            
+def getColorCheck(color,GENESIS_COLOR_SEQUENCE):
+    if isinstance(color, str):
+        if color.startswith("#"):
+            color = ( int(color[1:3], 16)
+                                , int(color[3:5], 16)
+                                , int(color[5:7], 16)
+                                )
+            index = nearestColorIndex(color, GENESIS_COLOR_SEQUENCE)
+            index = index/2
+            return index
+        elif color.startswith("("):
+            color = eval(color)[0:3]
+            index = nearestColorIndex(color, GENESIS_COLOR_SEQUENCE)
+            # In a genesis file the color index is half of the sequence index
+            index = index/2
+            return index
         else:
-                raise Exception("Invalid Color Value!")
-
-def getxyCord(xcord,ycord,list1,sceneitems):
-        for item in list1:
-                if not ( isinstance(item,Function) and isinstance(item,ZombieFunction) ):
-                        if sceneitems == None:
-                                objInfo = item.path+'/info'
-                                xpos = xyPosition(objInfo,'x')
-                                ypos = xyPosition(objInfo,'y')
-                        else:
-                                co = sceneitems[item]
-                                xpos = co.scenePos().x()
-                                ypos =-co.scenePos().y()
-                                #xpos = co['x']
-                                #ypos = co['y']
-
-                        cord[item] ={ 'x': xpos,'y':ypos}
-                        xcord.append(xpos)
-                        ycord.append(ypos)
-
-def xyPosition(objInfo,xory):
-    try:
-        return(float(element(objInfo).getField(xory)))
-    except ValueError:
-        return (float(0))
-def getCor(modelRoot,sceneitems):
-        xmin = ymin = 0.0
-        xmax = ymax = 1.0
-        positionInfoExist = False
-        xcord = []
-        ycord = []
-        mollist = realist = enzlist = cplxlist = tablist = funclist = []
-        meshEntryWildcard = '/##[ISA=ChemCompt]'
-        if modelRoot != '/':
-                meshEntryWildcard = modelRoot+meshEntryWildcard
-        for meshEnt in wildcardFind(meshEntryWildcard):
-                mol_cpl  = wildcardFind(meshEnt.path+'/##[ISA=PoolBase]')
-                realist  = wildcardFind(meshEnt.path+'/##[ISA=ReacBase]')
-                enzlist  = wildcardFind(meshEnt.path+'/##[ISA=EnzBase]')
-                funclist = wildcardFind(meshEnt.path+'/##[ISA=Function]')
-                tablist  = wildcardFind(meshEnt.path+'/##[ISA=StimulusTable]')
-                if mol_cpl or funclist or enzlist or realist or tablist:
-                        for m in mol_cpl:
-                                if isinstance(element(m.parent),CplxEnzBase):
-                                        cplxlist.append(m)
-                                        objInfo = m.parent.path+'/info'
-                                elif isinstance(element(m),PoolBase):
-                                        mollist.append(m)
-                                        objInfo =m.path+'/info'
-                                        #xx = xyPosition(objInfo,'x')
-                                        #yy = xyPosition(objInfo,'y')
-                                        
-                                
-                                if sceneitems == None:
-                                        xx = xyPosition(objInfo,'x')
-                                        yy = xyPosition(objInfo,'y')
-                                else:
-                                    c = sceneitems[m]
-                                    xx = c.scenePos().x()
-                                    yy =-c.scenePos().y()
-                                    #listq = sceneitems[m]
-                                    #xx = listq['x']
-                                    #yy = listq['y']
-                                    
-                                cord[m] ={ 'x': xx,'y':yy}
-                                xcord.append(xx)
-                                ycord.append(yy)
-                                
-                        getxyCord(xcord,ycord,realist,sceneitems)
-                        getxyCord(xcord,ycord,enzlist,sceneitems)
-                        getxyCord(xcord,ycord,funclist,sceneitems)
-                        getxyCord(xcord,ycord,tablist,sceneitems)
-        xmin = min(xcord)
-        xmax = max(xcord)
-        ymin = min(ycord)
-        ymax = max(ycord)
-        positionInfoExist = not(len(np.nonzero(xcord)[0]) == 0 \
-                and len(np.nonzero(ycord)[0]) == 0)
-
-        return(xmin,ymin,xmax,ymax,positionInfoExist)
-
+            index = color
+            return index
+    elif isinstance(color, tuple):
+        color = map(int, color)[0:3]
+        index = nearestColorIndex(color, GENESIS_COLOR_SEQUENCE)
+        return index
+    elif isinstance(color, int):
+        index = color
+        return index
+    else:
+        raise Exception("Invalid Color Value!")
+
+ignoreColor = ["mistyrose","antiquewhite","aliceblue","azure","bisque","black","blanchedalmond","blue","cornsilk","darkolivegreen","darkslategray","dimgray","floralwhite","gainsboro","ghostwhite","honeydew","ivory","lavender","lavenderblush","lemonchiffon","lightcyan","lightgoldenrodyellow","lightgray","lightyellow","linen","mediumblue","mintcream","navy","oldlace","papayawhip","saddlebrown","seashell","snow","wheat","white","whitesmoke","aquamarine","lightsalmon","moccasin","limegreen","sienna","beige","dimgrey","lightsage"]
+matplotcolor = {}
+for name,hexno in matplotlib.colors.cnames.iteritems():
+    matplotcolor[name]=hexno
+
+def getRandColor():
+    k = random.choice(matplotcolor.keys())
+    if k in ignoreColor:
+        return getRandColor()
+    else:
+        return k
 def writeCompartment(modelpath,compts,f):
-        index = 0
-        volIndex = {}
-        for compt in compts:
-                if compt.name != "kinetics":
-                        xgrp = xmax -random.randrange(1,10)
-                        ygrp = ymin +random.randrange(1,10)
-                        x = ((xgrp-xmin)/(xmax-xmin))*multi
-                        #y = ((ymax-ygrp)/(ymax-ymin))*multi
-                        y = ((ygrp-ymin)/(ymax-ymin))*multi
-                        f.write("simundump group /kinetics/" + compt.name + " 0 " + "blue" + " " + "green"       + " x 0 0 \"\" defaultfile \\\n" )
-                        f.write( "  defaultfile.g 0 0 0 " + str(int(x)) + " " + str(int(y)) + " 0\n")
-        i = 0
-        l = len(compts)
-        geometry = ""
-        for compt in compts:
-                size = compt.volume
-                ndim = compt.numDimensions
-                vecIndex = l-i-1
-                #print vecIndex
-                i = i+1
-                xgeo = xmax -random.randrange(1,10)
-                ygeo = ymin +random.randrange(1,10)
-                x = ((xgeo-xmin)/(xmax-xmin))*multi
-                #y = ((ymax-ygeo)/(ymax-ymin))*multi
-                y = ((ygeo-ymin)/(ymax-ymin))*multi
-                if vecIndex > 0:
-                        geometry = geometry+"simundump geometry /kinetics" +  "/geometry[" + str(vecIndex) +"] 0 " + str(size) + " " + str(ndim) + " sphere " +" \"\" white black "+ str(int(x)) + " " +str(int(y)) +" 0\n";
-                        volIndex[size] = "/geometry["+str(vecIndex)+"]"
-                else:
-                        geometry = geometry+"simundump geometry /kinetics"  +  "/geometry 0 " + str(size) + " " + str(ndim) + " sphere " +" \"\" white black " + str(int(x)) + " "+str(int(y))+ " 0\n";
-                        volIndex[size] = "/geometry"
-                f.write(geometry)
-        writeGroup(modelpath,f,xmax,ymax)
-        return volIndex
-
-def writeGroup(modelpath,f,xmax,ymax):
-        ignore = ["graphs","moregraphs","geometry","groups","conc1","conc2","conc3","conc4","model","data","graph_0","graph_1","graph_2","graph_3","graph_4","graph_5"]
-        for g in wildcardFind(modelpath+'/##[TYPE=Neutral]'):
-                if not g.name in ignore:
-                        if trimPath(g) != None:
-                                xgrp1 = xmax - random.randrange(1,10)
-                                ygrp1 = ymin + random.randrange(1,10)
-                                x = ((xgrp1-xmin)/(xmax-xmin))*multi
-                                #y = ((ymax-ygrp1)/(ymax-ymin))*multi
-                                y = ((ygrp1-ymin)/(ymax-ymin))*multi
-                                f.write("simundump group /kinetics/" + trimPath(g) + " 0 " +    "blue" + " " + "green"   + " x 0 0 \"\" defaultfile \\\n")
-                                f.write("  defaultfile.g 0 0 0 " + str(int(x)) + " " + str(int(y)) + " 0\n")
+    index = 0
+    volIndex = {}
+    for compt in compts:
+        if compt.name != "kinetics":
+            x = xmin+6
+            y = ymax+1
+            f.write("simundump group /kinetics/" + compt.name + " 0 " + "blue" + " " + "green"       + " x 0 0 \"\" defaultfile \\\n" )
+            f.write( "  defaultfile.g 0 0 0 " + str(int(x)) + " " + str(int(y)) + " 0\n")
+    i = 0
+    l = len(compts)
+    geometry = ""
+    for compt in compts:
+        size = compt.volume
+        ndim = compt.numDimensions
+        vecIndex = l-i-1
+        i = i+1
+        x = xmin+4
+        y = ymax+1
+        if vecIndex > 0:
+            geometry = geometry + "simundump geometry /kinetics" + "/geometry[" + str(vecIndex) + "] 0 " + str(size) + " " + str(ndim) + " sphere " + " \"\" white black " + str(int(x)) + " " + str(int(y)) + " 0\n"
+            volIndex[size] = "/geometry["+str(vecIndex)+"]"
+        else:
+            geometry = geometry + "simundump geometry /kinetics" + "/geometry 0 " + str(size) + " " + str(ndim) + " sphere " + " \"\" white black " + str(int(x)) + " " + str(int(y)) + " 0\n"
+            volIndex[size] = "/geometry"
+    # write the accumulated geometry once, after the loop, so entries are not duplicated
+    f.write(geometry)
+    writeGroup(modelpath,f)
+    return volIndex
+
+def writeGroup(modelpath,f):
+    ignore = ["graphs","moregraphs","geometry","groups","conc1","conc2","conc3","conc4","model","data","graph_0","graph_1","graph_2","graph_3","graph_4","graph_5"]
+    for g in wildcardFind(modelpath+'/##[TYPE=Neutral]'):
+        if g.name not in ignore:
+            if trimPath(g) is not None:
+                x = xmin+1
+                y = ymax+1
+                f.write("simundump group /kinetics/" + trimPath(g) + " 0 " +    "blue" + " " + "green"   + " x 0 0 \"\" defaultfile \\\n")
+                f.write("  defaultfile.g 0 0 0 " + str(int(x)) + " " + str(int(y)) + " 0\n")
 
 def writeHeader(f,maxVol):
-        simdt = 0.001
-        plotdt = 0.1
-        rawtime = 100
-        maxtime = 100
-        defaultVol = maxVol
-        f.write("//genesis\n"
-                        "// kkit Version 11 flat dumpfile\n\n"
-                        "// Saved on " + str(rawtime)+"\n"
-                        "include kkit {argv 1}\n"
-                        "FASTDT = " + str(simdt)+"\n"
-                        "SIMDT = " +str(simdt)+"\n"
-                        "CONTROLDT = " +str(plotdt)+"\n"
-                        "PLOTDT = " +str(plotdt)+"\n"
-                        "MAXTIME = " +str(maxtime)+"\n"
-                        "TRANSIENT_TIME = 2"+"\n"
-                        "VARIABLE_DT_FLAG = 0"+"\n"
-                        "DEFAULT_VOL = " +str(defaultVol)+"\n"
-                        "VERSION = 11.0 \n"
-                        "setfield /file/modpath value ~/scripts/modules\n"
-                        "kparms\n\n"
-                        )
-        f.write( "//genesis\n"
-                        "initdump -version 3 -ignoreorphans 1\n"
-                        "simobjdump table input output alloced step_mode stepsize x y z\n"
-                        "simobjdump xtree path script namemode sizescale\n"
-                        "simobjdump xcoredraw xmin xmax ymin ymax\n"
-                        "simobjdump xtext editable\n"
-                        "simobjdump xgraph xmin xmax ymin ymax overlay\n"
-                        "simobjdump xplot pixflags script fg ysquish do_slope wy\n"
-                        "simobjdump group xtree_fg_req xtree_textfg_req plotfield expanded movealone \\\n"
-                                "  link savename file version md5sum mod_save_flag x y z\n"
-                        "simobjdump geometry size dim shape outside xtree_fg_req xtree_textfg_req x y z\n"
-                        "simobjdump kpool DiffConst CoInit Co n nInit mwt nMin vol slave_enable \\\n"
-                                "  geomname xtree_fg_req xtree_textfg_req x y z\n"
-                        "simobjdump kreac kf kb notes xtree_fg_req xtree_textfg_req x y z\n"
-                        "simobjdump kenz CoComplexInit CoComplex nComplexInit nComplex vol k1 k2 k3 \\\n"
-                                "  keepconc usecomplex notes xtree_fg_req xtree_textfg_req link x y z\n"
-                        "simobjdump stim level1 width1 delay1 level2 width2 delay2 baselevel trig_time \\\n"
-                                "  trig_mode notes xtree_fg_req xtree_textfg_req is_running x y z\n"
-                        "simobjdump xtab input output alloced step_mode stepsize notes editfunc \\\n"
-                                "  xtree_fg_req xtree_textfg_req baselevel last_x last_y is_running x y z\n"
-                        "simobjdump kchan perm gmax Vm is_active use_nernst notewriteReacs xtree_fg_req \\\n"
-                                "  xtree_textfg_req x y z\n"
-                        "simobjdump transport input output alloced step_mode stepsize dt delay clock \\\n"
-                                "  kf xtree_fg_req xtree_textfg_req x y z\n"
-                        "simobjdump proto x y z\n"
-                        )
+    simdt = 0.001
+    plotdt = 0.1
+    rawtime = 100
+    maxtime = 100
+    defaultVol = maxVol
+    f.write("//genesis\n"
+            "// kkit Version 11 flat dumpfile\n\n"
+            "// Saved on " + str(rawtime)+"\n"
+            "include kkit {argv 1}\n"
+            "FASTDT = " + str(simdt)+"\n"
+            "SIMDT = " +str(simdt)+"\n"
+            "CONTROLDT = " +str(plotdt)+"\n"
+            "PLOTDT = " +str(plotdt)+"\n"
+            "MAXTIME = " +str(maxtime)+"\n"
+            "TRANSIENT_TIME = 2"+"\n"
+            "VARIABLE_DT_FLAG = 0"+"\n"
+            "DEFAULT_VOL = " +str(defaultVol)+"\n"
+            "VERSION = 11.0 \n"
+            "setfield /file/modpath value ~/scripts/modules\n"
+            "kparms\n\n"
+            )
+    f.write( "//genesis\n"
+            "initdump -version 3 -ignoreorphans 1\n"
+            "simobjdump table input output alloced step_mode stepsize x y z\n"
+            "simobjdump xtree path script namemode sizescale\n"
+            "simobjdump xcoredraw xmin xmax ymin ymax\n"
+            "simobjdump xtext editable\n"
+            "simobjdump xgraph xmin xmax ymin ymax overlay\n"
+            "simobjdump xplot pixflags script fg ysquish do_slope wy\n"
+            "simobjdump group xtree_fg_req xtree_textfg_req plotfield expanded movealone \\\n"
+                    "  link savename file version md5sum mod_save_flag x y z\n"
+            "simobjdump geometry size dim shape outside xtree_fg_req xtree_textfg_req x y z\n"
+            "simobjdump kpool DiffConst CoInit Co n nInit mwt nMin vol slave_enable \\\n"
+                    "  geomname xtree_fg_req xtree_textfg_req x y z\n"
+            "simobjdump kreac kf kb notes xtree_fg_req xtree_textfg_req x y z\n"
+            "simobjdump kenz CoComplexInit CoComplex nComplexInit nComplex vol k1 k2 k3 \\\n"
+                    "  keepconc usecomplex notes xtree_fg_req xtree_textfg_req link x y z\n"
+            "simobjdump stim level1 width1 delay1 level2 width2 delay2 baselevel trig_time \\\n"
+                    "  trig_mode notes xtree_fg_req xtree_textfg_req is_running x y z\n"
+            "simobjdump xtab input output alloced step_mode stepsize notes editfunc \\\n"
+                    "  xtree_fg_req xtree_textfg_req baselevel last_x last_y is_running x y z\n"
+            "simobjdump kchan perm gmax Vm is_active use_nernst notes xtree_fg_req \\\n"
+                    "  xtree_textfg_req x y z\n"
+            "simobjdump transport input output alloced step_mode stepsize dt delay clock \\\n"
+                    "  kf xtree_fg_req xtree_textfg_req x y z\n"
+            "simobjdump proto x y z\n"
+            )
 
 def estimateDefaultVol(compts):
-        maxVol = 0
-        vol = []
-        for compt in compts:
-                vol.append(compt.volume)
-        if len(vol) > 0:
-                return max(vol)
-        return maxVol
-
-def writeGui( f ):
-        f.write("simundump xgraph /graphs/conc1 0 0 99 0.001 0.999 0\n"
-        "simundump xgraph /graphs/conc2 0 0 100 0 1 0\n"
-        "simundump xgraph /moregraphs/conc3 0 0 100 0 1 0\n"
-        "simundump xgraph /moregraphs/conc4 0 0 100 0 1 0\n"
-        "simundump xcoredraw /edit/draw 0 -6 4 -2 6\n"
-        "simundump xtree /edit/draw/tree 0 \\\n"
-        "  /kinetics/#[],/kinetics/#[]/#[],/kinetics/#[]/#[]/#[][TYPE!=proto],/kinetics/#[]/#[]/#[][TYPE!=linkinfo]/##[] \"edit_elm.D <v>; drag_from_edit.w <d> <S> <x> <y> <z>\" auto 0.6\n"
-        "simundump xtext /file/notes 0 1\n")
+    maxVol = 0
+    vol = []
+    for compt in compts:
+        vol.append(compt.volume)
+    if len(vol) > 0:
+        return max(vol)
+    return maxVol
+
 def writeNotes(modelpath,f):
     notes = ""
     items = []
     items = wildcardFind(modelpath+"/##[ISA=ChemCompt]") +\
             wildcardFind(modelpath+"/##[ISA=PoolBase]") +\
@@ -605,21 +754,22 @@ def writeNotes(modelpath,f):
             notes = Annotator(info).getField('notes')
             if (notes):
                 f.write("call /kinetics/"+ trimPath(item)+"/notes LOAD \ \n\""+Annotator(info).getField('notes')+"\"\n")
-    
+
 def writeFooter1(f):
     f.write("\nenddump\n // End of dump\n")
+
 def writeFooter2(f):
     f.write("complete_loading\n")
 
 if __name__ == "__main__":
-        import sys
-
-        filename = sys.argv[1]
-        modelpath = filename[0:filename.find('.')]
-        loadModel(filename,'/'+modelpath,"gsl")
-        output = filename.g
-        written = write('/'+modelpath,output)
-        if written:
-                print(" file written to ",output)
-        else:
-                print(" could be written to kkit format")
+    import sys
+
+    filename = sys.argv[1]
+    modelpath = filename[0:filename.find('.')]
+    loadModel(filename,'/'+modelpath,"gsl")
+    output = modelpath+"_4mmoose.g"
+    written = write('/'+modelpath,output)
+    if written:
+        print(" file written to %s" % output)
+    else:
+        print(" could not be written to kkit format")
diff --git a/python/moose/moose.py b/python/moose/moose.py
index 51774dcb..7581c65f 100644
--- a/python/moose/moose.py
+++ b/python/moose/moose.py
@@ -164,7 +164,7 @@ def showfield(el, field='*', showtype=False):
         max_type_len = max(len(dtype) for dtype in value_field_dict.values())
         max_field_len = max(len(dtype) for dtype in value_field_dict.keys())
         print('\n[', el.path, ']')
-        for key, dtype in value_field_dict.items():
+        for key, dtype in sorted( value_field_dict.items() ):
             if dtype == 'bad' or key == 'this' or key == 'dummy' or key == 'me' or dtype.startswith('vector') or 'ObjId' in dtype:
                 continue
             value = el.getField(key)
@@ -291,6 +291,13 @@ def getfielddoc(tokens, indent=''):
             raise NameError('`%s` has no field called `%s`' 
                     % (tokens[0], tokens[1]))
                     
+def toUnicode(v, encoding='utf8'):
+    try:
+        return v.decode(encoding)
+    except (AttributeError, UnicodeEncodeError):
+        return str(v)
     
 def getmoosedoc(tokens, inherited=False):
     """Return MOOSE builtin documentation.
@@ -327,16 +334,17 @@ def getmoosedoc(tokens, inherited=False):
         except ValueError:
             raise NameError('name \'%s\' not defined.' % (tokens[0]))
         if len(tokens) > 1:
-            docstring.write(getfielddoc(tokens))
+            docstring.write( toUnicode( getfielddoc(tokens)) )
         else:
-            docstring.write('%s\n' % (class_element.docs))
+            docstring.write( toUnicode('%s\n' % (class_element.docs) ) )
             append_finfodocs(tokens[0], docstring, indent)
             if inherited:
                 mro = eval('_moose.%s' % (tokens[0])).mro()
                 for class_ in mro[1:]:
                     if class_ == _moose.melement:
                         break
-                    docstring.write('\n\n#Inherited from %s#\n' % (class_.__name__))
+                    docstring.write( toUnicode( 
+                        '\n\n#Inherited from %s#\n' % (class_.__name__) ) )
                     append_finfodocs(class_.__name__, docstring, indent)
                     if class_ == _moose.Neutral:    # Neutral is the toplevel moose class
                         break
@@ -350,14 +358,14 @@ def append_finfodocs(classname, docstring, indent):
     except ValueError:
         raise NameError('class \'%s\' not defined.' % (classname))
     for ftype, rname in finfotypes:
-        docstring.write('\n*%s*\n' % (rname.capitalize()))
+        docstring.write( toUnicode( '\n*%s*\n' % (rname.capitalize())) )
         try:
             finfo = _moose.element('%s/%s' % (class_element.path, ftype))
             for field in finfo.vec:
-                docstring.write('%s%s: %s\n' % 
-                            (indent, field.fieldName, field.type))
+                docstring.write( toUnicode( 
+                    '%s%s: %s\n' % (indent, field.fieldName, field.type)) )
         except ValueError:
-            docstring.write('%sNone\n' % (indent))
+            docstring.write( toUnicode( '%sNone\n' % (indent) ) )
     
     
 # the global pager is set from pydoc even if the user asks for paged
diff --git a/python/moose/neuroml2/generated_neuroml.py b/python/moose/neuroml2/generated_neuroml.py
index 1db782c1..e2d4c5a3 100644
--- a/python/moose/neuroml2/generated_neuroml.py
+++ b/python/moose/neuroml2/generated_neuroml.py
@@ -2,71 +2,43 @@
 # -*- coding: utf-8 -*-
 
 #
-# Generated Sun Jul 28 10:18:38 2013 by generateDS.py version 2.10a.
+# Generated Sun Apr 17 15:01:27 2016 by generateDS.py version 2.22a.
+#
+# Command line options:
+#   ('-o', 'generated_neuroml.py')
+#   ('-s', 'generated_neuromlsub.py')
+#
+# Command line arguments:
+#   /home/subha/src/neuroml_dev/NeuroML2/Schemas/NeuroML2/NeuroML_v2beta.xsd
+#
+# Command line:
+#   /home/subha/.local/bin/generateDS.py -o "generated_neuroml.py" -s "generated_neuromlsub.py" /home/subha/src/neuroml_dev/NeuroML2/Schemas/NeuroML2/NeuroML_v2beta.xsd
+#
+# Current working directory (os.getcwd()):
+#   neuroml2
 #
 
-from __future__ import print_function
 import sys
-import getopt
 import re as re_
 import base64
 import datetime as datetime_
+import warnings as warnings_
+from lxml import etree as etree_
 
-try:
-    basestr
-except NameError:
-    basestr = str
-
-etree_ = None
-Verbose_import_ = False
-XMLParser_import_none, XMLParser_import_lxml, XMLParser_import_elementtree = 0, 1, 2
-XMLParser_import_library = None
-try:
-    # lxml
-    from lxml import etree as etree_
-    XMLParser_import_library = XMLParser_import_lxml
-    if Verbose_import_:
-        print("running with lxml.etree")
-except ImportError:
-    try:
-        # cElementTree from Python 2.5+
-        import xml.etree.cElementTree as etree_
-        XMLParser_import_library = XMLParser_import_elementtree
-        if Verbose_import_:
-            print("running with cElementTree on Python 2.5+")
-    except ImportError:
-        try:
-            # ElementTree from Python 2.5+
-            import xml.etree.ElementTree as etree_
-            XMLParser_import_library = XMLParser_import_elementtree
-            if Verbose_import_:
-                print("running with ElementTree on Python 2.5+")
-        except ImportError:
-            try:
-                # normal cElementTree install
-                import cElementTree as etree_
-                XMLParser_import_library = XMLParser_import_elementtree
-                if Verbose_import_:
-                    print("running with cElementTree")
-            except ImportError:
-                try:
-                    # normal ElementTree install
-                    import elementtree.ElementTree as etree_
-                    XMLParser_import_library = XMLParser_import_elementtree
-                    if Verbose_import_:
-                        print("running with ElementTree")
-                except ImportError:
-                    raise ImportError(
-                        "Failed to import ElementTree from any known place")
-
-
-def parsexml_(*args, **kwargs):
-    if (XMLParser_import_library == XMLParser_import_lxml and
-            'parser' not in kwargs):
+
+Validate_simpletypes_ = True
+if sys.version_info.major == 2:
+    BaseStrType_ = basestring
+else:
+    BaseStrType_ = str
+
+
+def parsexml_(infile, parser=None, **kwargs):
+    if parser is None:
         # Use the lxml ElementTree compatible parser so that, e.g.,
         #   we ignore comments.
-        kwargs['parser'] = etree_.ETCompatXMLParser()
-    doc = etree_.parse(*args, **kwargs)
+        parser = etree_.ETCompatXMLParser()
+    doc = etree_.parse(infile, parser=parser, **kwargs)
     return doc
 
 #
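The regenerated module drops the five-way ElementTree import fallback and requires lxml unconditionally; `parsexml_` now takes an explicit `parser` argument and defaults to `ETCompatXMLParser` so comments are ignored. A stdlib stand-in illustrating the same comment-skipping behavior (lxml is swapped for `xml.etree.ElementTree`, which also drops comments by default):

```python
import io
import xml.etree.ElementTree as etree_

def parsexml_(infile, parser=None, **kwargs):
    # Mirrors the lxml-based helper in the diff: an optional parser
    # argument, with a default that ignores XML comments.
    return etree_.parse(infile, parser=parser, **kwargs)

doc = parsexml_(io.BytesIO(b'<a><!-- a comment --><b/></a>'))
child_tags = [child.tag for child in doc.getroot()]  # comment is skipped
```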
@@ -94,61 +66,68 @@ except ImportError as exp:
                 return None
         def gds_format_string(self, input_data, input_name=''):
             return input_data
-        def gds_validate_string(self, input_data, node, input_name=''):
-            return input_data
+        def gds_validate_string(self, input_data, node=None, input_name=''):
+            if not input_data:
+                return ''
+            else:
+                return input_data
         def gds_format_base64(self, input_data, input_name=''):
             return base64.b64encode(input_data)
-        def gds_validate_base64(self, input_data, node, input_name=''):
+        def gds_validate_base64(self, input_data, node=None, input_name=''):
             return input_data
         def gds_format_integer(self, input_data, input_name=''):
             return '%d' % input_data
-        def gds_validate_integer(self, input_data, node, input_name=''):
+        def gds_validate_integer(self, input_data, node=None, input_name=''):
             return input_data
         def gds_format_integer_list(self, input_data, input_name=''):
-            return '%s' % input_data
-        def gds_validate_integer_list(self, input_data, node, input_name=''):
+            return '%s' % ' '.join(input_data)
+        def gds_validate_integer_list(
+                self, input_data, node=None, input_name=''):
             values = input_data.split()
             for value in values:
                 try:
-                    float(value)
+                    int(value)
                 except (TypeError, ValueError):
                     raise_parse_error(node, 'Requires sequence of integers')
-            return input_data
+            return values
         def gds_format_float(self, input_data, input_name=''):
-            return '%f' % input_data
-        def gds_validate_float(self, input_data, node, input_name=''):
+            return ('%.15f' % input_data).rstrip('0')
+        def gds_validate_float(self, input_data, node=None, input_name=''):
             return input_data
         def gds_format_float_list(self, input_data, input_name=''):
-            return '%s' % input_data
-        def gds_validate_float_list(self, input_data, node, input_name=''):
+            return '%s' % ' '.join(input_data)
+        def gds_validate_float_list(
+                self, input_data, node=None, input_name=''):
             values = input_data.split()
             for value in values:
                 try:
                     float(value)
                 except (TypeError, ValueError):
                     raise_parse_error(node, 'Requires sequence of floats')
-            return input_data
+            return values
         def gds_format_double(self, input_data, input_name=''):
             return '%e' % input_data
-        def gds_validate_double(self, input_data, node, input_name=''):
+        def gds_validate_double(self, input_data, node=None, input_name=''):
             return input_data
         def gds_format_double_list(self, input_data, input_name=''):
-            return '%s' % input_data
-        def gds_validate_double_list(self, input_data, node, input_name=''):
+            return '%s' % ' '.join(input_data)
+        def gds_validate_double_list(
+                self, input_data, node=None, input_name=''):
             values = input_data.split()
             for value in values:
                 try:
                     float(value)
                 except (TypeError, ValueError):
                     raise_parse_error(node, 'Requires sequence of doubles')
-            return input_data
+            return values
         def gds_format_boolean(self, input_data, input_name=''):
             return ('%s' % input_data).lower()
-        def gds_validate_boolean(self, input_data, node, input_name=''):
+        def gds_validate_boolean(self, input_data, node=None, input_name=''):
             return input_data
         def gds_format_boolean_list(self, input_data, input_name=''):
-            return '%s' % input_data
-        def gds_validate_boolean_list(self, input_data, node, input_name=''):
+            return '%s' % ' '.join(input_data)
+        def gds_validate_boolean_list(
+                self, input_data, node=None, input_name=''):
             values = input_data.split()
             for value in values:
                 if value not in ('true', '1', 'false', '0', ):
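The `gds_format_*_list` methods changed from `'%s' % input_data` (the Python repr of the whole list) to a space-joined string, and the `gds_validate_*_list` methods now return the split token list rather than the raw string; the integer validator also now correctly checks with `int()` instead of `float()`. Standalone versions of the float pair, for illustration:

```python
def gds_format_float_list(input_data, input_name=''):
    # input_data is a sequence of numeric strings; emit them space-separated
    return '%s' % ' '.join(input_data)

def gds_validate_float_list(input_data, node=None, input_name=''):
    values = input_data.split()
    for value in values:
        float(value)  # raises ValueError on a malformed token
    return values

formatted = gds_format_float_list(['1.0', '2.5'])
tokens = gds_validate_float_list('1.0 2.5')
```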
@@ -156,8 +135,8 @@ except ImportError as exp:
                         node,
                         'Requires sequence of booleans '
                         '("true", "1", "false", "0")')
-            return input_data
-        def gds_validate_datetime(self, input_data, node, input_name=''):
+            return values
+        def gds_validate_datetime(self, input_data, node=None, input_name=''):
             return input_data
         def gds_format_datetime(self, input_data, input_name=''):
             if input_data.microsecond == 0:
@@ -199,7 +178,7 @@ except ImportError as exp:
         def gds_parse_datetime(cls, input_data):
             tz = None
             if input_data[-1] == 'Z':
-                tz = GeneratedsSuper._FixedOffsetTZ(0, 'GMT')
+                tz = GeneratedsSuper._FixedOffsetTZ(0, 'UTC')
                 input_data = input_data[:-1]
             else:
                 results = GeneratedsSuper.tzoff_pattern.search(input_data)
@@ -211,7 +190,10 @@ except ImportError as exp:
                     tz = GeneratedsSuper._FixedOffsetTZ(
                         tzoff, results.group(0))
                     input_data = input_data[:-6]
-            if len(input_data.split('.')) > 1:
+            time_parts = input_data.split('.')
+            if len(time_parts) > 1:
+                micro_seconds = int(float('0.' + time_parts[1]) * 1000000)
+                input_data = '%s.%s' % (time_parts[0], micro_seconds, )
                 dt = datetime_.datetime.strptime(
                     input_data, '%Y-%m-%dT%H:%M:%S.%f')
             else:
@@ -219,7 +201,7 @@ except ImportError as exp:
                     input_data, '%Y-%m-%dT%H:%M:%S')
             dt = dt.replace(tzinfo=tz)
             return dt
-        def gds_validate_date(self, input_data, node, input_name=''):
+        def gds_validate_date(self, input_data, node=None, input_name=''):
             return input_data
         def gds_format_date(self, input_data, input_name=''):
             _svalue = '%04d-%02d-%02d' % (
@@ -242,7 +224,8 @@ except ImportError as exp:
                                 _svalue += '+'
                             hours = total_seconds // 3600
                             minutes = (total_seconds - (hours * 3600)) // 60
-                            _svalue += '{0:02d}:{1:02d}'.format(hours, minutes)
+                            _svalue += '{0:02d}:{1:02d}'.format(
+                                hours, minutes)
             except AttributeError:
                 pass
             return _svalue
@@ -250,7 +233,7 @@ except ImportError as exp:
         def gds_parse_date(cls, input_data):
             tz = None
             if input_data[-1] == 'Z':
-                tz = GeneratedsSuper._FixedOffsetTZ(0, 'GMT')
+                tz = GeneratedsSuper._FixedOffsetTZ(0, 'UTC')
                 input_data = input_data[:-1]
             else:
                 results = GeneratedsSuper.tzoff_pattern.search(input_data)
@@ -265,7 +248,7 @@ except ImportError as exp:
             dt = datetime_.datetime.strptime(input_data, '%Y-%m-%d')
             dt = dt.replace(tzinfo=tz)
             return dt.date()
-        def gds_validate_time(self, input_data, node, input_name=''):
+        def gds_validate_time(self, input_data, node=None, input_name=''):
             return input_data
         def gds_format_time(self, input_data, input_name=''):
             if input_data.microsecond == 0:
@@ -297,11 +280,26 @@ except ImportError as exp:
                         minutes = (total_seconds - (hours * 3600)) // 60
                         _svalue += '{0:02d}:{1:02d}'.format(hours, minutes)
             return _svalue
+        def gds_validate_simple_patterns(self, patterns, target):
+            # pat is a list of lists of strings/patterns.  We should:
+            # - AND the outer elements
+            # - OR the inner elements
+            found1 = True
+            for patterns1 in patterns:
+                found2 = False
+                for patterns2 in patterns1:
+                    if re_.search(patterns2, target) is not None:
+                        found2 = True
+                        break
+                if not found2:
+                    found1 = False
+                    break
+            return found1
         @classmethod
         def gds_parse_time(cls, input_data):
             tz = None
             if input_data[-1] == 'Z':
-                tz = GeneratedsSuper._FixedOffsetTZ(0, 'GMT')
+                tz = GeneratedsSuper._FixedOffsetTZ(0, 'UTC')
                 input_data = input_data[:-1]
             else:
                 results = GeneratedsSuper.tzoff_pattern.search(input_data)
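The new `gds_validate_simple_patterns` helper ANDs the outer list and ORs each inner list of regexes. An equivalent compact formulation, checked against the `NmlId` pattern list that appears later in this diff:

```python
import re as re_

def gds_validate_simple_patterns(patterns, target):
    # Outer elements are ANDed; inner alternatives are ORed.
    return all(
        any(re_.search(pat, target) is not None for pat in alternatives)
        for alternatives in patterns)

validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
ok = gds_validate_simple_patterns(validate_NmlId_patterns_, 'cell_01')
bad = gds_validate_simple_patterns(validate_NmlId_patterns_, 'bad id')
```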
@@ -352,6 +350,20 @@ except ImportError as exp:
         @classmethod
         def gds_reverse_node_mapping(cls, mapping):
             return dict(((v, k) for k, v in mapping.iteritems()))
+        @staticmethod
+        def gds_encode(instring):
+            if sys.version_info.major == 2:
+                return instring.encode(ExternalEncoding)
+            else:
+                return instring
+
+    def getSubclassFromModule_(module, class_):
+        '''Get the subclass of a class from a specific module.'''
+        name = class_.__name__ + 'Sub'
+        if hasattr(module, name):
+            return getattr(module, name)
+        else:
+            return None
 
 
 #
@@ -377,6 +389,11 @@ ExternalEncoding = 'ascii'
 Tag_pattern_ = re_.compile(r'({.*})?(.*)')
 String_cleanup_pat_ = re_.compile(r"[\n\r\s]+")
 Namespace_extract_pat_ = re_.compile(r'{(.*)}(.*)')
+CDATA_pattern_ = re_.compile(r"<!\[CDATA\[.*?\]\]>", re_.DOTALL)
+
+# Change this to redirect the generated superclass module to use a
+# specific subclass module.
+CurrentSubclassModule_ = None
 
 #
 # Support/utility functions.
@@ -390,19 +407,32 @@ def showIndent(outfile, level, pretty_print=True):
 
 
 def quote_xml(inStr):
+    "Escape markup chars, but do not modify CDATA sections."
     if not inStr:
         return ''
-    s1 = (isinstance(inStr, basestring) and inStr or
-          '%s' % inStr)
-    s1 = s1.replace('&', '&amp;')
+    s1 = (isinstance(inStr, BaseStrType_) and inStr or '%s' % inStr)
+    s2 = ''
+    pos = 0
+    matchobjects = CDATA_pattern_.finditer(s1)
+    for mo in matchobjects:
+        s3 = s1[pos:mo.start()]
+        s2 += quote_xml_aux(s3)
+        s2 += s1[mo.start():mo.end()]
+        pos = mo.end()
+    s3 = s1[pos:]
+    s2 += quote_xml_aux(s3)
+    return s2
+
+
+def quote_xml_aux(inStr):
+    s1 = inStr.replace('&', '&amp;')
     s1 = s1.replace('<', '&lt;')
     s1 = s1.replace('>', '&gt;')
     return s1
 
 
 def quote_attrib(inStr):
-    s1 = (isinstance(inStr, basestring) and inStr or
-          '%s' % inStr)
+    s1 = (isinstance(inStr, BaseStrType_) and inStr or '%s' % inStr)
     s1 = s1.replace('&', '&amp;')
     s1 = s1.replace('<', '&lt;')
     s1 = s1.replace('>', '&gt;')
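`quote_xml` now splits the input on `CDATA_pattern_` so that markup characters are escaped everywhere except inside CDATA sections, which are copied through verbatim. A self-contained sketch of the same logic (Python 3 only, so `BaseStrType_` collapses to `str`):

```python
import re as re_

CDATA_pattern_ = re_.compile(r"<!\[CDATA\[.*?\]\]>", re_.DOTALL)

def quote_xml_aux(s):
    return s.replace('&', '&amp;').replace('<', '&lt;').replace('>', '&gt;')

def quote_xml(inStr):
    # Escape markup chars, but do not modify CDATA sections.
    if not inStr:
        return ''
    s1 = inStr if isinstance(inStr, str) else '%s' % inStr
    out, pos = '', 0
    for mo in CDATA_pattern_.finditer(s1):
        out += quote_xml_aux(s1[pos:mo.start()]) + s1[mo.start():mo.end()]
        pos = mo.end()
    return out + quote_xml_aux(s1[pos:])

escaped = quote_xml('a < b <![CDATA[keep <raw> & all]]> c > d')
```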
@@ -462,11 +492,7 @@ class GDSParseError(Exception):
 
 
 def raise_parse_error(node, msg):
-    if XMLParser_import_library == XMLParser_import_lxml:
-        msg = '%s (element %s/line %d)' % (
-            msg, node.tag, node.sourceline, )
-    else:
-        msg = '%s (element %s)' % (msg, node.tag, )
+    msg = '%s (element %s/line %d)' % (msg, node.tag, node.sourceline, )
     raise GDSParseError(msg)
 
 
@@ -616,11 +642,17 @@ class Annotation(GeneratedsSuper):
     subclass = None
     superclass = None
     def __init__(self, anytypeobjs_=None):
+        self.original_tagname_ = None
         if anytypeobjs_ is None:
             self.anytypeobjs_ = []
         else:
             self.anytypeobjs_ = anytypeobjs_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Annotation)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Annotation.subclass:
             return Annotation.subclass(*args_, **kwargs_)
         else:
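Every generated `factory` now consults the module-level `CurrentSubclassModule_` hook before falling back to the class's `subclass` attribute, so callers can redirect construction to `<ClassName>Sub` classes in a companion module (the `generated_neuromlsub.py` produced by the `-s` option). A minimal demonstration of the lookup, with a throwaway module object standing in for the subclass module:

```python
import types

CurrentSubclassModule_ = None

def getSubclassFromModule_(module, class_):
    '''Get the subclass of a class from a specific module.'''
    name = class_.__name__ + 'Sub'
    return getattr(module, name, None)

class Annotation(object):
    subclass = None
    @staticmethod
    def factory(*args_, **kwargs_):
        if CurrentSubclassModule_ is not None:
            subclass = getSubclassFromModule_(
                CurrentSubclassModule_, Annotation)
            if subclass is not None:
                return subclass(*args_, **kwargs_)
        if Annotation.subclass:
            return Annotation.subclass(*args_, **kwargs_)
        return Annotation(*args_, **kwargs_)

class AnnotationSub(Annotation):
    pass

# Redirect construction through a stand-in subclass module.
sub_module = types.ModuleType('generated_neuromlsub')
sub_module.AnnotationSub = AnnotationSub
CurrentSubclassModule_ = sub_module
obj = Annotation.factory()
```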
@@ -642,13 +674,15 @@ class Annotation(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Annotation')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Annotation', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -662,29 +696,13 @@ class Annotation(GeneratedsSuper):
             eol_ = ''
         for obj_ in self.anytypeobjs_:
             obj_.export(outfile, level, namespace_, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='Annotation'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        pass
-    def exportLiteralChildren(self, outfile, level, name_):
-        showIndent(outfile, level)
-        outfile.write('anytypeobjs_=[\n')
-        level += 1
-        for anytypeobjs_ in self.anytypeobjs_:
-            anytypeobjs_.exportLiteral(outfile, level)
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         pass
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
@@ -699,15 +717,21 @@ class ComponentType(GeneratedsSuper):
     ComponentType."""
     subclass = None
     superclass = None
-    def __init__(self, extends=None, name=None, description=None, anytypeobjs_=None):
-        self.extends = _cast(None, extends)
+    def __init__(self, name=None, extends=None, description=None, anytypeobjs_=None):
+        self.original_tagname_ = None
         self.name = _cast(None, name)
+        self.extends = _cast(None, extends)
         self.description = _cast(None, description)
         if anytypeobjs_ is None:
             self.anytypeobjs_ = []
         else:
             self.anytypeobjs_ = anytypeobjs_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, ComponentType)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if ComponentType.subclass:
             return ComponentType.subclass(*args_, **kwargs_)
         else:
@@ -717,10 +741,10 @@ class ComponentType(GeneratedsSuper):
     def set_anytypeobjs_(self, anytypeobjs_): self.anytypeobjs_ = anytypeobjs_
     def add_anytypeobjs_(self, value): self.anytypeobjs_.append(value)
     def insert_anytypeobjs_(self, index, value): self._anytypeobjs_[index] = value
-    def get_extends(self): return self.extends
-    def set_extends(self, extends): self.extends = extends
     def get_name(self): return self.name
     def set_name(self, name): self.name = name
+    def get_extends(self): return self.extends
+    def set_extends(self, extends): self.extends = extends
     def get_description(self): return self.description
     def set_description(self, description): self.description = description
     def hasContent_(self):
@@ -735,27 +759,29 @@ class ComponentType(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='ComponentType')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='ComponentType', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='ComponentType'):
-        if self.extends is not None and 'extends' not in already_processed:
-            already_processed.add('extends')
-            outfile.write(' extends=%s' % (self.gds_format_string(quote_attrib(self.extends).encode(ExternalEncoding), input_name='extends'), ))
         if self.name is not None and 'name' not in already_processed:
             already_processed.add('name')
-            outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
+            outfile.write(' name=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.name), input_name='name')), ))
+        if self.extends is not None and 'extends' not in already_processed:
+            already_processed.add('extends')
+            outfile.write(' extends=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.extends), input_name='extends')), ))
         if self.description is not None and 'description' not in already_processed:
             already_processed.add('description')
-            outfile.write(' description=%s' % (self.gds_format_string(quote_attrib(self.description).encode(ExternalEncoding), input_name='description'), ))
+            outfile.write(' description=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.description), input_name='description')), ))
     def exportChildren(self, outfile, level, namespace_='', name_='ComponentType', fromsubclass_=False, pretty_print=True):
         if pretty_print:
             eol_ = '\n'
@@ -763,49 +789,22 @@ class ComponentType(GeneratedsSuper):
             eol_ = ''
         for obj_ in self.anytypeobjs_:
             obj_.export(outfile, level, namespace_, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='ComponentType'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.extends is not None and 'extends' not in already_processed:
-            already_processed.add('extends')
-            showIndent(outfile, level)
-            outfile.write('extends="%s",\n' % (self.extends,))
-        if self.name is not None and 'name' not in already_processed:
-            already_processed.add('name')
-            showIndent(outfile, level)
-            outfile.write('name="%s",\n' % (self.name,))
-        if self.description is not None and 'description' not in already_processed:
-            already_processed.add('description')
-            showIndent(outfile, level)
-            outfile.write('description="%s",\n' % (self.description,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        showIndent(outfile, level)
-        outfile.write('anytypeobjs_=[\n')
-        level += 1
-        for anytypeobjs_ in self.anytypeobjs_:
-            anytypeobjs_.exportLiteral(outfile, level)
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('extends', node)
-        if value is not None and 'extends' not in already_processed:
-            already_processed.add('extends')
-            self.extends = value
         value = find_attr_value_('name', node)
         if value is not None and 'name' not in already_processed:
             already_processed.add('name')
             self.name = value
+        value = find_attr_value_('extends', node)
+        if value is not None and 'extends' not in already_processed:
+            already_processed.add('extends')
+            self.extends = value
         value = find_attr_value_('description', node)
         if value is not None and 'description' not in already_processed:
             already_processed.add('description')
@@ -821,6 +820,7 @@ class IncludeType(GeneratedsSuper):
     subclass = None
     superclass = None
     def __init__(self, href=None, valueOf_=None, mixedclass_=None, content_=None):
+        self.original_tagname_ = None
         self.href = _cast(None, href)
         self.valueOf_ = valueOf_
         if mixedclass_ is None:
@@ -833,6 +833,11 @@ class IncludeType(GeneratedsSuper):
             self.content_ = content_
         self.valueOf_ = valueOf_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, IncludeType)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if IncludeType.subclass:
             return IncludeType.subclass(*args_, **kwargs_)
         else:
@@ -844,7 +849,7 @@ class IncludeType(GeneratedsSuper):
     def set_valueOf_(self, valueOf_): self.valueOf_ = valueOf_
     def hasContent_(self):
         if (
-            self.valueOf_
+            1 if type(self.valueOf_) in [int,float] else self.valueOf_
         ):
             return True
         else:
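The rewritten `hasContent_` guard treats numeric values specially, so an element whose `valueOf_` is `0` or `0.0` is still serialized with content instead of being collapsed to a self-closing tag by ordinary falsiness. The predicate in isolation:

```python
def has_content(valueOf_):
    # Numeric types always count as content, even when zero;
    # other types fall back to normal truthiness.
    if (1 if type(valueOf_) in [int, float] else valueOf_):
        return True
    return False

zero_has_content = has_content(0)    # numeric zero still counts
empty_has_content = has_content('')  # empty string does not
```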
@@ -854,6 +859,8 @@ class IncludeType(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
@@ -864,24 +871,9 @@ class IncludeType(GeneratedsSuper):
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='IncludeType'):
         if self.href is not None and 'href' not in already_processed:
             already_processed.add('href')
-            outfile.write(' href=%s' % (self.gds_format_string(quote_attrib(self.href).encode(ExternalEncoding), input_name='href'), ))
+            outfile.write(' href=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.href), input_name='href')), ))
     def exportChildren(self, outfile, level, namespace_='', name_='IncludeType', fromsubclass_=False, pretty_print=True):
         pass
-    def exportLiteral(self, outfile, level, name_='IncludeType'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-        showIndent(outfile, level)
-        outfile.write('valueOf_ = """%s""",\n' % (self.valueOf_,))
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.href is not None and 'href' not in already_processed:
-            already_processed.add('href')
-            showIndent(outfile, level)
-            outfile.write('href="%s",\n' % (self.href,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        pass
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
@@ -893,6 +885,7 @@ class IncludeType(GeneratedsSuper):
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('href', node)
         if value is not None and 'href' not in already_processed:
@@ -910,35 +903,52 @@ class IncludeType(GeneratedsSuper):
 class Q10Settings(GeneratedsSuper):
     subclass = None
     superclass = None
-    def __init__(self, fixedQ10=None, experimentalTemp=None, type_=None, q10Factor=None):
-        self.fixedQ10 = _cast(None, fixedQ10)
-        self.experimentalTemp = _cast(None, experimentalTemp)
+    def __init__(self, type_=None, fixedQ10=None, q10Factor=None, experimentalTemp=None):
+        self.original_tagname_ = None
         self.type_ = _cast(None, type_)
+        self.fixedQ10 = _cast(None, fixedQ10)
         self.q10Factor = _cast(None, q10Factor)
-        pass
+        self.experimentalTemp = _cast(None, experimentalTemp)
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Q10Settings)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Q10Settings.subclass:
             return Q10Settings.subclass(*args_, **kwargs_)
         else:
             return Q10Settings(*args_, **kwargs_)
     factory = staticmethod(factory)
+    def get_type(self): return self.type_
+    def set_type(self, type_): self.type_ = type_
     def get_fixedQ10(self): return self.fixedQ10
     def set_fixedQ10(self, fixedQ10): self.fixedQ10 = fixedQ10
-    def validate_Nml2Quantity_none(self, value):
-        # Validate type Nml2Quantity_none, a restriction on xs:string.
-        pass
+    def get_q10Factor(self): return self.q10Factor
+    def set_q10Factor(self, q10Factor): self.q10Factor = q10Factor
     def get_experimentalTemp(self): return self.experimentalTemp
     def set_experimentalTemp(self, experimentalTemp): self.experimentalTemp = experimentalTemp
-    def validate_Nml2Quantity_temperature(self, value):
-        # Validate type Nml2Quantity_temperature, a restriction on xs:string.
-        pass
-    def get_type(self): return self.type_
-    def set_type(self, type_): self.type_ = type_
     def validate_NmlId(self, value):
         # Validate type NmlId, a restriction on xs:string.
-        pass
-    def get_q10Factor(self): return self.q10Factor
-    def set_q10Factor(self, q10Factor): self.q10Factor = q10Factor
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
+    def validate_Nml2Quantity_none(self, value):
+        # Validate type Nml2Quantity_none, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_none_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_none_patterns_, ))
+    validate_Nml2Quantity_none_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?$']]
+    def validate_Nml2Quantity_temperature(self, value):
+        # Validate type Nml2Quantity_temperature, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_temperature_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_temperature_patterns_, ))
+    validate_Nml2Quantity_temperature_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(degC)$']]
     def hasContent_(self):
         if (
 
@@ -951,55 +961,32 @@ class Q10Settings(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Q10Settings')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Q10Settings', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='Q10Settings'):
-        if self.fixedQ10 is not None and 'fixedQ10' not in already_processed:
-            already_processed.add('fixedQ10')
-            outfile.write(' fixedQ10=%s' % (quote_attrib(self.fixedQ10), ))
-        if self.experimentalTemp is not None and 'experimentalTemp' not in already_processed:
-            already_processed.add('experimentalTemp')
-            outfile.write(' experimentalTemp=%s' % (quote_attrib(self.experimentalTemp), ))
         if self.type_ is not None and 'type_' not in already_processed:
             already_processed.add('type_')
             outfile.write(' type=%s' % (quote_attrib(self.type_), ))
+        if self.fixedQ10 is not None and 'fixedQ10' not in already_processed:
+            already_processed.add('fixedQ10')
+            outfile.write(' fixedQ10=%s' % (quote_attrib(self.fixedQ10), ))
         if self.q10Factor is not None and 'q10Factor' not in already_processed:
             already_processed.add('q10Factor')
             outfile.write(' q10Factor=%s' % (quote_attrib(self.q10Factor), ))
-    def exportChildren(self, outfile, level, namespace_='', name_='Q10Settings', fromsubclass_=False, pretty_print=True):
-        pass
-    def exportLiteral(self, outfile, level, name_='Q10Settings'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.fixedQ10 is not None and 'fixedQ10' not in already_processed:
-            already_processed.add('fixedQ10')
-            showIndent(outfile, level)
-            outfile.write('fixedQ10="%s",\n' % (self.fixedQ10,))
         if self.experimentalTemp is not None and 'experimentalTemp' not in already_processed:
             already_processed.add('experimentalTemp')
-            showIndent(outfile, level)
-            outfile.write('experimentalTemp="%s",\n' % (self.experimentalTemp,))
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            showIndent(outfile, level)
-            outfile.write('type_="%s",\n' % (self.type_,))
-        if self.q10Factor is not None and 'q10Factor' not in already_processed:
-            already_processed.add('q10Factor')
-            showIndent(outfile, level)
-            outfile.write('q10Factor="%s",\n' % (self.q10Factor,))
-    def exportLiteralChildren(self, outfile, level, name_):
+            outfile.write(' experimentalTemp=%s' % (quote_attrib(self.experimentalTemp), ))
+    def exportChildren(self, outfile, level, namespace_='', name_='Q10Settings', fromsubclass_=False, pretty_print=True):
         pass
     def build(self, node):
         already_processed = set()
@@ -1007,27 +994,28 @@ class Q10Settings(GeneratedsSuper):
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('fixedQ10', node)
-        if value is not None and 'fixedQ10' not in already_processed:
-            already_processed.add('fixedQ10')
-            self.fixedQ10 = value
-            self.validate_Nml2Quantity_none(self.fixedQ10)    # validate type Nml2Quantity_none
-        value = find_attr_value_('experimentalTemp', node)
-        if value is not None and 'experimentalTemp' not in already_processed:
-            already_processed.add('experimentalTemp')
-            self.experimentalTemp = value
-            self.validate_Nml2Quantity_temperature(self.experimentalTemp)    # validate type Nml2Quantity_temperature
         value = find_attr_value_('type', node)
         if value is not None and 'type' not in already_processed:
             already_processed.add('type')
             self.type_ = value
             self.validate_NmlId(self.type_)    # validate type NmlId
+        value = find_attr_value_('fixedQ10', node)
+        if value is not None and 'fixedQ10' not in already_processed:
+            already_processed.add('fixedQ10')
+            self.fixedQ10 = value
+            self.validate_Nml2Quantity_none(self.fixedQ10)    # validate type Nml2Quantity_none
         value = find_attr_value_('q10Factor', node)
         if value is not None and 'q10Factor' not in already_processed:
             already_processed.add('q10Factor')
             self.q10Factor = value
             self.validate_Nml2Quantity_none(self.q10Factor)    # validate type Nml2Quantity_none
+        value = find_attr_value_('experimentalTemp', node)
+        if value is not None and 'experimentalTemp' not in already_processed:
+            already_processed.add('experimentalTemp')
+            self.experimentalTemp = value
+            self.validate_Nml2Quantity_temperature(self.experimentalTemp)    # validate type Nml2Quantity_temperature
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         pass
 # end class Q10Settings
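The main behavioral change in this hunk is that the `validate_*` stubs now actually check values against xsd pattern restrictions and emit a warning on mismatch. A minimal standalone sketch of how those patterns behave (regexes copied verbatim from the diff; the `matches` helper is an illustration, not the generated `gds_validate_simple_patterns` itself):

```python
import re

# Patterns taken from the generated validators above. The generated code
# warns (rather than raises) when a value fails its pattern.
NML_ID = r'^[a-zA-Z0-9_]*$'
NML2_QUANTITY_NONE = r'^-?([0-9]*(\.[0-9]+)?)([eE]-?[0-9]+)?$'
NML2_QUANTITY_TEMPERATURE = r'^-?([0-9]*(\.[0-9]+)?)([eE]-?[0-9]+)?[\s]*(degC)$'

def matches(pattern, value):
    """Return True when the value satisfies the anchored xsd pattern."""
    return re.match(pattern, value) is not None

# A dimensionless quantity may carry an exponent; a temperature must
# end in the unit string 'degC'; an NmlId allows only word characters.
ok_temp = matches(NML2_QUANTITY_TEMPERATURE, '32 degC')
ok_none = matches(NML2_QUANTITY_NONE, '3.2e1')
bad_id = matches(NML_ID, 'bad-id')
```

Here `ok_temp` and `ok_none` are true while `bad_id` is false, matching the warning behavior of the generated validators.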
@@ -1036,35 +1024,52 @@ class Q10Settings(GeneratedsSuper):
 class HHRate(GeneratedsSuper):
     subclass = None
     superclass = None
-    def __init__(self, midpoint=None, rate=None, scale=None, type_=None):
-        self.midpoint = _cast(None, midpoint)
+    def __init__(self, type_=None, rate=None, midpoint=None, scale=None):
+        self.original_tagname_ = None
+        self.type_ = _cast(None, type_)
         self.rate = _cast(None, rate)
+        self.midpoint = _cast(None, midpoint)
         self.scale = _cast(None, scale)
-        self.type_ = _cast(None, type_)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, HHRate)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if HHRate.subclass:
             return HHRate.subclass(*args_, **kwargs_)
         else:
             return HHRate(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_midpoint(self): return self.midpoint
-    def set_midpoint(self, midpoint): self.midpoint = midpoint
-    def validate_Nml2Quantity_voltage(self, value):
-        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
-        pass
+    def get_type(self): return self.type_
+    def set_type(self, type_): self.type_ = type_
     def get_rate(self): return self.rate
     def set_rate(self, rate): self.rate = rate
-    def validate_Nml2Quantity_pertime(self, value):
-        # Validate type Nml2Quantity_pertime, a restriction on xs:string.
-        pass
+    def get_midpoint(self): return self.midpoint
+    def set_midpoint(self, midpoint): self.midpoint = midpoint
     def get_scale(self): return self.scale
     def set_scale(self, scale): self.scale = scale
-    def get_type(self): return self.type_
-    def set_type(self, type_): self.type_ = type_
     def validate_NmlId(self, value):
         # Validate type NmlId, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
+    def validate_Nml2Quantity_pertime(self, value):
+        # Validate type Nml2Quantity_pertime, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_pertime_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_pertime_patterns_, ))
+    validate_Nml2Quantity_pertime_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(per_s|per_ms|Hz)$']]
+    def validate_Nml2Quantity_voltage(self, value):
+        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_voltage_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_voltage_patterns_, ))
+    validate_Nml2Quantity_voltage_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(V|mV)$']]
     def hasContent_(self):
         if (
 
@@ -1077,55 +1082,32 @@ class HHRate(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='HHRate')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='HHRate', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='HHRate'):
-        if self.midpoint is not None and 'midpoint' not in already_processed:
-            already_processed.add('midpoint')
-            outfile.write(' midpoint=%s' % (quote_attrib(self.midpoint), ))
-        if self.rate is not None and 'rate' not in already_processed:
-            already_processed.add('rate')
-            outfile.write(' rate=%s' % (quote_attrib(self.rate), ))
-        if self.scale is not None and 'scale' not in already_processed:
-            already_processed.add('scale')
-            outfile.write(' scale=%s' % (quote_attrib(self.scale), ))
         if self.type_ is not None and 'type_' not in already_processed:
             already_processed.add('type_')
             outfile.write(' type=%s' % (quote_attrib(self.type_), ))
-    def exportChildren(self, outfile, level, namespace_='', name_='HHRate', fromsubclass_=False, pretty_print=True):
-        pass
-    def exportLiteral(self, outfile, level, name_='HHRate'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.midpoint is not None and 'midpoint' not in already_processed:
-            already_processed.add('midpoint')
-            showIndent(outfile, level)
-            outfile.write('midpoint="%s",\n' % (self.midpoint,))
         if self.rate is not None and 'rate' not in already_processed:
             already_processed.add('rate')
-            showIndent(outfile, level)
-            outfile.write('rate="%s",\n' % (self.rate,))
+            outfile.write(' rate=%s' % (quote_attrib(self.rate), ))
+        if self.midpoint is not None and 'midpoint' not in already_processed:
+            already_processed.add('midpoint')
+            outfile.write(' midpoint=%s' % (quote_attrib(self.midpoint), ))
         if self.scale is not None and 'scale' not in already_processed:
             already_processed.add('scale')
-            showIndent(outfile, level)
-            outfile.write('scale="%s",\n' % (self.scale,))
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            showIndent(outfile, level)
-            outfile.write('type_="%s",\n' % (self.type_,))
-    def exportLiteralChildren(self, outfile, level, name_):
+            outfile.write(' scale=%s' % (quote_attrib(self.scale), ))
+    def exportChildren(self, outfile, level, namespace_='', name_='HHRate', fromsubclass_=False, pretty_print=True):
         pass
     def build(self, node):
         already_processed = set()
@@ -1133,27 +1115,28 @@ class HHRate(GeneratedsSuper):
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('midpoint', node)
-        if value is not None and 'midpoint' not in already_processed:
-            already_processed.add('midpoint')
-            self.midpoint = value
-            self.validate_Nml2Quantity_voltage(self.midpoint)    # validate type Nml2Quantity_voltage
+        value = find_attr_value_('type', node)
+        if value is not None and 'type' not in already_processed:
+            already_processed.add('type')
+            self.type_ = value
+            self.validate_NmlId(self.type_)    # validate type NmlId
         value = find_attr_value_('rate', node)
         if value is not None and 'rate' not in already_processed:
             already_processed.add('rate')
             self.rate = value
             self.validate_Nml2Quantity_pertime(self.rate)    # validate type Nml2Quantity_pertime
+        value = find_attr_value_('midpoint', node)
+        if value is not None and 'midpoint' not in already_processed:
+            already_processed.add('midpoint')
+            self.midpoint = value
+            self.validate_Nml2Quantity_voltage(self.midpoint)    # validate type Nml2Quantity_voltage
         value = find_attr_value_('scale', node)
         if value is not None and 'scale' not in already_processed:
             already_processed.add('scale')
             self.scale = value
             self.validate_Nml2Quantity_voltage(self.scale)    # validate type Nml2Quantity_voltage
-        value = find_attr_value_('type', node)
-        if value is not None and 'type' not in already_processed:
-            already_processed.add('type')
-            self.type_ = value
-            self.validate_NmlId(self.type_)    # validate type NmlId
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         pass
 # end class HHRate
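Each `factory()` in this patch gains a lookup through `CurrentSubclassModule_` before falling back to the class-level `subclass` hook. The following is a self-contained sketch of that dispatch order under assumed semantics for `getSubclassFromModule_` (scan a module for a subclass of the target class); it is not the generateDS implementation:

```python
class HHRate(object):
    subclass = None
    def __init__(self, type_=None, rate=None):
        self.type_ = type_
        self.rate = rate

class MyHHRate(HHRate):
    """A user override, as application code might register."""
    pass

def getSubclassFromModule_(module, cls):
    # Assumed behavior: return the first subclass of cls found in module.
    for name in dir(module):
        candidate = getattr(module, name)
        if isinstance(candidate, type) and issubclass(candidate, cls) \
                and candidate is not cls:
            return candidate
    return None

CurrentSubclassModule_ = None  # set to a module to activate overrides

def factory(*args_, **kwargs_):
    # Dispatch order mirrors the patched factory: module lookup first,
    # then the class-level subclass hook, then the base class.
    if CurrentSubclassModule_ is not None:
        subclass = getSubclassFromModule_(CurrentSubclassModule_, HHRate)
        if subclass is not None:
            return subclass(*args_, **kwargs_)
    if HHRate.subclass:
        return HHRate.subclass(*args_, **kwargs_)
    return HHRate(*args_, **kwargs_)

default = factory()           # no override registered -> base class
HHRate.subclass = MyHHRate    # register an override
overridden = factory()        # -> MyHHRate
```

This lets a downstream package substitute richer classes for the generated ones without editing the generated module.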
@@ -1162,32 +1145,45 @@ class HHRate(GeneratedsSuper):
 class HHVariable(GeneratedsSuper):
     subclass = None
     superclass = None
-    def __init__(self, midpoint=None, rate=None, scale=None, type_=None):
-        self.midpoint = _cast(None, midpoint)
+    def __init__(self, type_=None, rate=None, midpoint=None, scale=None):
+        self.original_tagname_ = None
+        self.type_ = _cast(None, type_)
         self.rate = _cast(float, rate)
+        self.midpoint = _cast(None, midpoint)
         self.scale = _cast(None, scale)
-        self.type_ = _cast(None, type_)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, HHVariable)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if HHVariable.subclass:
             return HHVariable.subclass(*args_, **kwargs_)
         else:
             return HHVariable(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_midpoint(self): return self.midpoint
-    def set_midpoint(self, midpoint): self.midpoint = midpoint
-    def validate_Nml2Quantity_voltage(self, value):
-        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
-        pass
+    def get_type(self): return self.type_
+    def set_type(self, type_): self.type_ = type_
     def get_rate(self): return self.rate
     def set_rate(self, rate): self.rate = rate
+    def get_midpoint(self): return self.midpoint
+    def set_midpoint(self, midpoint): self.midpoint = midpoint
     def get_scale(self): return self.scale
     def set_scale(self, scale): self.scale = scale
-    def get_type(self): return self.type_
-    def set_type(self, type_): self.type_ = type_
     def validate_NmlId(self, value):
         # Validate type NmlId, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
+    def validate_Nml2Quantity_voltage(self, value):
+        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_voltage_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_voltage_patterns_, ))
+    validate_Nml2Quantity_voltage_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(V|mV)$']]
     def hasContent_(self):
         if (
 
@@ -1200,55 +1196,32 @@ class HHVariable(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='HHVariable')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='HHVariable', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='HHVariable'):
-        if self.midpoint is not None and 'midpoint' not in already_processed:
-            already_processed.add('midpoint')
-            outfile.write(' midpoint=%s' % (quote_attrib(self.midpoint), ))
-        if self.rate is not None and 'rate' not in already_processed:
-            already_processed.add('rate')
-            outfile.write(' rate="%s"' % self.gds_format_float(self.rate, input_name='rate'))
-        if self.scale is not None and 'scale' not in already_processed:
-            already_processed.add('scale')
-            outfile.write(' scale=%s' % (quote_attrib(self.scale), ))
         if self.type_ is not None and 'type_' not in already_processed:
             already_processed.add('type_')
             outfile.write(' type=%s' % (quote_attrib(self.type_), ))
-    def exportChildren(self, outfile, level, namespace_='', name_='HHVariable', fromsubclass_=False, pretty_print=True):
-        pass
-    def exportLiteral(self, outfile, level, name_='HHVariable'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.midpoint is not None and 'midpoint' not in already_processed:
-            already_processed.add('midpoint')
-            showIndent(outfile, level)
-            outfile.write('midpoint="%s",\n' % (self.midpoint,))
         if self.rate is not None and 'rate' not in already_processed:
             already_processed.add('rate')
-            showIndent(outfile, level)
-            outfile.write('rate=%f,\n' % (self.rate,))
+            outfile.write(' rate="%s"' % self.gds_format_float(self.rate, input_name='rate'))
+        if self.midpoint is not None and 'midpoint' not in already_processed:
+            already_processed.add('midpoint')
+            outfile.write(' midpoint=%s' % (quote_attrib(self.midpoint), ))
         if self.scale is not None and 'scale' not in already_processed:
             already_processed.add('scale')
-            showIndent(outfile, level)
-            outfile.write('scale="%s",\n' % (self.scale,))
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            showIndent(outfile, level)
-            outfile.write('type_="%s",\n' % (self.type_,))
-    def exportLiteralChildren(self, outfile, level, name_):
+            outfile.write(' scale=%s' % (quote_attrib(self.scale), ))
+    def exportChildren(self, outfile, level, namespace_='', name_='HHVariable', fromsubclass_=False, pretty_print=True):
         pass
     def build(self, node):
         already_processed = set()
@@ -1256,29 +1229,30 @@ class HHVariable(GeneratedsSuper):
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('midpoint', node)
-        if value is not None and 'midpoint' not in already_processed:
-            already_processed.add('midpoint')
-            self.midpoint = value
-            self.validate_Nml2Quantity_voltage(self.midpoint)    # validate type Nml2Quantity_voltage
-        value = find_attr_value_('rate', node)
+        value = find_attr_value_('type', node)
+        if value is not None and 'type' not in already_processed:
+            already_processed.add('type')
+            self.type_ = value
+            self.validate_NmlId(self.type_)    # validate type NmlId
+        value = find_attr_value_('rate', node)
         if value is not None and 'rate' not in already_processed:
             already_processed.add('rate')
             try:
                 self.rate = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (rate): %s' % exp)
+        value = find_attr_value_('midpoint', node)
+        if value is not None and 'midpoint' not in already_processed:
+            already_processed.add('midpoint')
+            self.midpoint = value
+            self.validate_Nml2Quantity_voltage(self.midpoint)    # validate type Nml2Quantity_voltage
         value = find_attr_value_('scale', node)
         if value is not None and 'scale' not in already_processed:
             already_processed.add('scale')
             self.scale = value
             self.validate_Nml2Quantity_voltage(self.scale)    # validate type Nml2Quantity_voltage
-        value = find_attr_value_('type', node)
-        if value is not None and 'type' not in already_processed:
-            already_processed.add('type')
-            self.type_ = value
-            self.validate_NmlId(self.type_)    # validate type NmlId
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         pass
 # end class HHVariable
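Another small but useful change repeated across these classes: `build(node)` now ends with `return self`, so parsing and use can be a single expression. A sketch mirroring the HHVariable attribute handling (`rate` cast to float with the same error message, other attributes kept as strings); the class name and use of `ElementTree` are illustrative assumptions, not the generated bindings:

```python
import xml.etree.ElementTree as ET

class HHVariableSketch(object):
    def __init__(self):
        self.type_ = None
        self.rate = None
        self.midpoint = None
        self.scale = None

    def build(self, node):
        value = node.get('type')
        if value is not None:
            self.type_ = value
        value = node.get('rate')
        if value is not None:
            try:
                self.rate = float(value)   # rate is the one float-typed attribute
            except ValueError as exp:
                raise ValueError('Bad float/double attribute (rate): %s' % exp)
        self.midpoint = node.get('midpoint')
        self.scale = node.get('scale')
        return self  # returning self enables one-line parse-and-use

node = ET.fromstring(
    '<hhVariable type="HHSigmoidVariable" rate="1" '
    'midpoint="-35mV" scale="5mV"/>')
var = HHVariableSketch().build(node)
```

Without the `return self`, callers had to split construction and building into two statements; with it, `HHVariableSketch().build(node)` yields the populated object directly.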
@@ -1287,38 +1261,55 @@ class HHVariable(GeneratedsSuper):
 class HHTime(GeneratedsSuper):
     subclass = None
     superclass = None
-    def __init__(self, midpoint=None, rate=None, scale=None, type_=None, tau=None):
-        self.midpoint = _cast(None, midpoint)
+    def __init__(self, type_=None, rate=None, midpoint=None, scale=None, tau=None):
+        self.original_tagname_ = None
+        self.type_ = _cast(None, type_)
         self.rate = _cast(None, rate)
+        self.midpoint = _cast(None, midpoint)
         self.scale = _cast(None, scale)
-        self.type_ = _cast(None, type_)
         self.tau = _cast(None, tau)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, HHTime)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if HHTime.subclass:
             return HHTime.subclass(*args_, **kwargs_)
         else:
             return HHTime(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_midpoint(self): return self.midpoint
-    def set_midpoint(self, midpoint): self.midpoint = midpoint
-    def validate_Nml2Quantity_voltage(self, value):
-        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
-        pass
+    def get_type(self): return self.type_
+    def set_type(self, type_): self.type_ = type_
     def get_rate(self): return self.rate
     def set_rate(self, rate): self.rate = rate
-    def validate_Nml2Quantity_time(self, value):
-        # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
+    def get_midpoint(self): return self.midpoint
+    def set_midpoint(self, midpoint): self.midpoint = midpoint
     def get_scale(self): return self.scale
     def set_scale(self, scale): self.scale = scale
-    def get_type(self): return self.type_
-    def set_type(self, type_): self.type_ = type_
-    def validate_NmlId(self, value):
-        # Validate type NmlId, a restriction on xs:string.
-        pass
     def get_tau(self): return self.tau
     def set_tau(self, tau): self.tau = tau
+    def validate_NmlId(self, value):
+        # Validate type NmlId, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
+    def validate_Nml2Quantity_time(self, value):
+        # Validate type Nml2Quantity_time, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
+    def validate_Nml2Quantity_voltage(self, value):
+        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_voltage_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_voltage_patterns_, ))
+    validate_Nml2Quantity_voltage_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(V|mV)$']]
     def hasContent_(self):
         if (
 
@@ -1331,90 +1322,64 @@ class HHTime(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='HHTime')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='HHTime', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='HHTime'):
-        if self.midpoint is not None and 'midpoint' not in already_processed:
-            already_processed.add('midpoint')
-            outfile.write(' midpoint=%s' % (quote_attrib(self.midpoint), ))
+        if self.type_ is not None and 'type_' not in already_processed:
+            already_processed.add('type_')
+            outfile.write(' type=%s' % (quote_attrib(self.type_), ))
         if self.rate is not None and 'rate' not in already_processed:
             already_processed.add('rate')
             outfile.write(' rate=%s' % (quote_attrib(self.rate), ))
+        if self.midpoint is not None and 'midpoint' not in already_processed:
+            already_processed.add('midpoint')
+            outfile.write(' midpoint=%s' % (quote_attrib(self.midpoint), ))
         if self.scale is not None and 'scale' not in already_processed:
             already_processed.add('scale')
             outfile.write(' scale=%s' % (quote_attrib(self.scale), ))
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            outfile.write(' type=%s' % (quote_attrib(self.type_), ))
         if self.tau is not None and 'tau' not in already_processed:
             already_processed.add('tau')
             outfile.write(' tau=%s' % (quote_attrib(self.tau), ))
     def exportChildren(self, outfile, level, namespace_='', name_='HHTime', fromsubclass_=False, pretty_print=True):
         pass
-    def exportLiteral(self, outfile, level, name_='HHTime'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.midpoint is not None and 'midpoint' not in already_processed:
-            already_processed.add('midpoint')
-            showIndent(outfile, level)
-            outfile.write('midpoint="%s",\n' % (self.midpoint,))
-        if self.rate is not None and 'rate' not in already_processed:
-            already_processed.add('rate')
-            showIndent(outfile, level)
-            outfile.write('rate="%s",\n' % (self.rate,))
-        if self.scale is not None and 'scale' not in already_processed:
-            already_processed.add('scale')
-            showIndent(outfile, level)
-            outfile.write('scale="%s",\n' % (self.scale,))
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            showIndent(outfile, level)
-            outfile.write('type_="%s",\n' % (self.type_,))
-        if self.tau is not None and 'tau' not in already_processed:
-            already_processed.add('tau')
-            showIndent(outfile, level)
-            outfile.write('tau="%s",\n' % (self.tau,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        pass
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('midpoint', node)
-        if value is not None and 'midpoint' not in already_processed:
-            already_processed.add('midpoint')
-            self.midpoint = value
-            self.validate_Nml2Quantity_voltage(self.midpoint)    # validate type Nml2Quantity_voltage
+        value = find_attr_value_('type', node)
+        if value is not None and 'type' not in already_processed:
+            already_processed.add('type')
+            self.type_ = value
+            self.validate_NmlId(self.type_)    # validate type NmlId
         value = find_attr_value_('rate', node)
         if value is not None and 'rate' not in already_processed:
             already_processed.add('rate')
             self.rate = value
             self.validate_Nml2Quantity_time(self.rate)    # validate type Nml2Quantity_time
+        value = find_attr_value_('midpoint', node)
+        if value is not None and 'midpoint' not in already_processed:
+            already_processed.add('midpoint')
+            self.midpoint = value
+            self.validate_Nml2Quantity_voltage(self.midpoint)    # validate type Nml2Quantity_voltage
         value = find_attr_value_('scale', node)
         if value is not None and 'scale' not in already_processed:
             already_processed.add('scale')
             self.scale = value
             self.validate_Nml2Quantity_voltage(self.scale)    # validate type Nml2Quantity_voltage
-        value = find_attr_value_('type', node)
-        if value is not None and 'type' not in already_processed:
-            already_processed.add('type')
-            self.type_ = value
-            self.validate_NmlId(self.type_)    # validate type NmlId
         value = find_attr_value_('tau', node)
         if value is not None and 'tau' not in already_processed:
             already_processed.add('tau')
@@ -1428,41 +1393,67 @@ class HHTime(GeneratedsSuper):
 class BlockMechanism(GeneratedsSuper):
     subclass = None
     superclass = None
-    def __init__(self, blockConcentration=None, scalingConc=None, type_=None, species=None, scalingVolt=None):
-        self.blockConcentration = _cast(None, blockConcentration)
-        self.scalingConc = _cast(None, scalingConc)
+    def __init__(self, type_=None, species=None, blockConcentration=None, scalingConc=None, scalingVolt=None):
+        self.original_tagname_ = None
         self.type_ = _cast(None, type_)
         self.species = _cast(None, species)
+        self.blockConcentration = _cast(None, blockConcentration)
+        self.scalingConc = _cast(None, scalingConc)
         self.scalingVolt = _cast(None, scalingVolt)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, BlockMechanism)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if BlockMechanism.subclass:
             return BlockMechanism.subclass(*args_, **kwargs_)
         else:
             return BlockMechanism(*args_, **kwargs_)
     factory = staticmethod(factory)
+    def get_type(self): return self.type_
+    def set_type(self, type_): self.type_ = type_
+    def get_species(self): return self.species
+    def set_species(self, species): self.species = species
     def get_blockConcentration(self): return self.blockConcentration
     def set_blockConcentration(self, blockConcentration): self.blockConcentration = blockConcentration
-    def validate_Nml2Quantity_concentration(self, value):
-        # Validate type Nml2Quantity_concentration, a restriction on xs:string.
-        pass
     def get_scalingConc(self): return self.scalingConc
     def set_scalingConc(self, scalingConc): self.scalingConc = scalingConc
-    def get_type(self): return self.type_
-    def set_type(self, type_): self.type_ = type_
+    def get_scalingVolt(self): return self.scalingVolt
+    def set_scalingVolt(self, scalingVolt): self.scalingVolt = scalingVolt
     def validate_BlockTypes(self, value):
         # Validate type BlockTypes, a restriction on xs:string.
-        pass
-    def get_species(self): return self.species
-    def set_species(self, species): self.species = species
+        if value is not None and Validate_simpletypes_:
+            value = str(value)
+            enumerations = ['voltageConcDepBlockMechanism']
+            enumeration_respectee = False
+            for enum in enumerations:
+                if value == enum:
+                    enumeration_respectee = True
+                    break
+            if not enumeration_respectee:
+                warnings_.warn('Value "%(value)s" does not match xsd enumeration restriction on BlockTypes' % {"value" : value.encode("utf-8")} )
     def validate_NmlId(self, value):
         # Validate type NmlId, a restriction on xs:string.
-        pass
-    def get_scalingVolt(self): return self.scalingVolt
-    def set_scalingVolt(self, scalingVolt): self.scalingVolt = scalingVolt
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
+    def validate_Nml2Quantity_concentration(self, value):
+        # Validate type Nml2Quantity_concentration, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_concentration_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_concentration_patterns_, ))
+    validate_Nml2Quantity_concentration_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(mol_per_m3|mol_per_cm3|M|mM)$']]
     def validate_Nml2Quantity_voltage(self, value):
         # Validate type Nml2Quantity_voltage, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_voltage_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_voltage_patterns_, ))
+    validate_Nml2Quantity_voltage_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(V|mV)$']]
     def hasContent_(self):
         if (
 
@@ -1475,62 +1466,35 @@ class BlockMechanism(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='BlockMechanism')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='BlockMechanism', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='BlockMechanism'):
-        if self.blockConcentration is not None and 'blockConcentration' not in already_processed:
-            already_processed.add('blockConcentration')
-            outfile.write(' blockConcentration=%s' % (quote_attrib(self.blockConcentration), ))
-        if self.scalingConc is not None and 'scalingConc' not in already_processed:
-            already_processed.add('scalingConc')
-            outfile.write(' scalingConc=%s' % (quote_attrib(self.scalingConc), ))
         if self.type_ is not None and 'type_' not in already_processed:
             already_processed.add('type_')
             outfile.write(' type=%s' % (quote_attrib(self.type_), ))
         if self.species is not None and 'species' not in already_processed:
             already_processed.add('species')
             outfile.write(' species=%s' % (quote_attrib(self.species), ))
-        if self.scalingVolt is not None and 'scalingVolt' not in already_processed:
-            already_processed.add('scalingVolt')
-            outfile.write(' scalingVolt=%s' % (quote_attrib(self.scalingVolt), ))
-    def exportChildren(self, outfile, level, namespace_='', name_='BlockMechanism', fromsubclass_=False, pretty_print=True):
-        pass
-    def exportLiteral(self, outfile, level, name_='BlockMechanism'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
         if self.blockConcentration is not None and 'blockConcentration' not in already_processed:
             already_processed.add('blockConcentration')
-            showIndent(outfile, level)
-            outfile.write('blockConcentration="%s",\n' % (self.blockConcentration,))
+            outfile.write(' blockConcentration=%s' % (quote_attrib(self.blockConcentration), ))
         if self.scalingConc is not None and 'scalingConc' not in already_processed:
             already_processed.add('scalingConc')
-            showIndent(outfile, level)
-            outfile.write('scalingConc="%s",\n' % (self.scalingConc,))
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            showIndent(outfile, level)
-            outfile.write('type_="%s",\n' % (self.type_,))
-        if self.species is not None and 'species' not in already_processed:
-            already_processed.add('species')
-            showIndent(outfile, level)
-            outfile.write('species="%s",\n' % (self.species,))
+            outfile.write(' scalingConc=%s' % (quote_attrib(self.scalingConc), ))
         if self.scalingVolt is not None and 'scalingVolt' not in already_processed:
             already_processed.add('scalingVolt')
-            showIndent(outfile, level)
-            outfile.write('scalingVolt="%s",\n' % (self.scalingVolt,))
-    def exportLiteralChildren(self, outfile, level, name_):
+            outfile.write(' scalingVolt=%s' % (quote_attrib(self.scalingVolt), ))
+    def exportChildren(self, outfile, level, namespace_='', name_='BlockMechanism', fromsubclass_=False, pretty_print=True):
         pass
     def build(self, node):
         already_processed = set()
@@ -1538,17 +1502,8 @@ class BlockMechanism(GeneratedsSuper):
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('blockConcentration', node)
-        if value is not None and 'blockConcentration' not in already_processed:
-            already_processed.add('blockConcentration')
-            self.blockConcentration = value
-            self.validate_Nml2Quantity_concentration(self.blockConcentration)    # validate type Nml2Quantity_concentration
-        value = find_attr_value_('scalingConc', node)
-        if value is not None and 'scalingConc' not in already_processed:
-            already_processed.add('scalingConc')
-            self.scalingConc = value
-            self.validate_Nml2Quantity_concentration(self.scalingConc)    # validate type Nml2Quantity_concentration
         value = find_attr_value_('type', node)
         if value is not None and 'type' not in already_processed:
             already_processed.add('type')
@@ -1559,6 +1514,16 @@ class BlockMechanism(GeneratedsSuper):
             already_processed.add('species')
             self.species = value
             self.validate_NmlId(self.species)    # validate type NmlId
+        value = find_attr_value_('blockConcentration', node)
+        if value is not None and 'blockConcentration' not in already_processed:
+            already_processed.add('blockConcentration')
+            self.blockConcentration = value
+            self.validate_Nml2Quantity_concentration(self.blockConcentration)    # validate type Nml2Quantity_concentration
+        value = find_attr_value_('scalingConc', node)
+        if value is not None and 'scalingConc' not in already_processed:
+            already_processed.add('scalingConc')
+            self.scalingConc = value
+            self.validate_Nml2Quantity_concentration(self.scalingConc)    # validate type Nml2Quantity_concentration
         value = find_attr_value_('scalingVolt', node)
         if value is not None and 'scalingVolt' not in already_processed:
             already_processed.add('scalingVolt')
@@ -1572,13 +1537,18 @@ class BlockMechanism(GeneratedsSuper):
 class PlasticityMechanism(GeneratedsSuper):
     subclass = None
     superclass = None
-    def __init__(self, type_=None, tauFac=None, tauRec=None, initReleaseProb=None):
+    def __init__(self, type_=None, initReleaseProb=None, tauRec=None, tauFac=None):
+        self.original_tagname_ = None
         self.type_ = _cast(None, type_)
-        self.tauFac = _cast(None, tauFac)
-        self.tauRec = _cast(None, tauRec)
         self.initReleaseProb = _cast(None, initReleaseProb)
-        pass
+        self.tauRec = _cast(None, tauRec)
+        self.tauFac = _cast(None, tauFac)
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, PlasticityMechanism)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if PlasticityMechanism.subclass:
             return PlasticityMechanism.subclass(*args_, **kwargs_)
         else:
@@ -1586,21 +1556,38 @@ class PlasticityMechanism(GeneratedsSuper):
     factory = staticmethod(factory)
     def get_type(self): return self.type_
     def set_type(self, type_): self.type_ = type_
-    def validate_PlasticityTypes(self, value):
-        # Validate type PlasticityTypes, a restriction on xs:string.
-        pass
-    def get_tauFac(self): return self.tauFac
-    def set_tauFac(self, tauFac): self.tauFac = tauFac
-    def validate_Nml2Quantity_time(self, value):
-        # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
-    def get_tauRec(self): return self.tauRec
-    def set_tauRec(self, tauRec): self.tauRec = tauRec
     def get_initReleaseProb(self): return self.initReleaseProb
     def set_initReleaseProb(self, initReleaseProb): self.initReleaseProb = initReleaseProb
+    def get_tauRec(self): return self.tauRec
+    def set_tauRec(self, tauRec): self.tauRec = tauRec
+    def get_tauFac(self): return self.tauFac
+    def set_tauFac(self, tauFac): self.tauFac = tauFac
+    def validate_PlasticityTypes(self, value):
+        # Validate type PlasticityTypes, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            value = str(value)
+            enumerations = ['tsodyksMarkramDepMechanism', 'tsodyksMarkramDepFacMechanism']
+            enumeration_respectee = False
+            for enum in enumerations:
+                if value == enum:
+                    enumeration_respectee = True
+                    break
+            if not enumeration_respectee:
+                warnings_.warn('Value "%(value)s" does not match xsd enumeration restriction on PlasticityTypes' % {"value" : value.encode("utf-8")} )
     def validate_ZeroToOne(self, value):
         # Validate type ZeroToOne, a restriction on xs:double.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if value < 0:
+                warnings_.warn('Value "%(value)s" does not match xsd minInclusive restriction on ZeroToOne' % {"value" : value} )
+            if value > 1:
+                warnings_.warn('Value "%(value)s" does not match xsd maxInclusive restriction on ZeroToOne' % {"value" : value} )
+    def validate_Nml2Quantity_time(self, value):
+        # Validate type Nml2Quantity_time, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
     def hasContent_(self):
         if (
 
@@ -1613,13 +1600,15 @@ class PlasticityMechanism(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='PlasticityMechanism')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='PlasticityMechanism', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
@@ -1627,41 +1616,16 @@ class PlasticityMechanism(GeneratedsSuper):
         if self.type_ is not None and 'type_' not in already_processed:
             already_processed.add('type_')
             outfile.write(' type=%s' % (quote_attrib(self.type_), ))
-        if self.tauFac is not None and 'tauFac' not in already_processed:
-            already_processed.add('tauFac')
-            outfile.write(' tauFac=%s' % (quote_attrib(self.tauFac), ))
-        if self.tauRec is not None and 'tauRec' not in already_processed:
-            already_processed.add('tauRec')
-            outfile.write(' tauRec=%s' % (quote_attrib(self.tauRec), ))
         if self.initReleaseProb is not None and 'initReleaseProb' not in already_processed:
             already_processed.add('initReleaseProb')
             outfile.write(' initReleaseProb=%s' % (quote_attrib(self.initReleaseProb), ))
-    def exportChildren(self, outfile, level, namespace_='', name_='PlasticityMechanism', fromsubclass_=False, pretty_print=True):
-        pass
-    def exportLiteral(self, outfile, level, name_='PlasticityMechanism'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            showIndent(outfile, level)
-            outfile.write('type_="%s",\n' % (self.type_,))
-        if self.tauFac is not None and 'tauFac' not in already_processed:
-            already_processed.add('tauFac')
-            showIndent(outfile, level)
-            outfile.write('tauFac="%s",\n' % (self.tauFac,))
         if self.tauRec is not None and 'tauRec' not in already_processed:
             already_processed.add('tauRec')
-            showIndent(outfile, level)
-            outfile.write('tauRec="%s",\n' % (self.tauRec,))
-        if self.initReleaseProb is not None and 'initReleaseProb' not in already_processed:
-            already_processed.add('initReleaseProb')
-            showIndent(outfile, level)
-            outfile.write('initReleaseProb=%e,\n' % (self.initReleaseProb,))
-    def exportLiteralChildren(self, outfile, level, name_):
+            outfile.write(' tauRec=%s' % (quote_attrib(self.tauRec), ))
+        if self.tauFac is not None and 'tauFac' not in already_processed:
+            already_processed.add('tauFac')
+            outfile.write(' tauFac=%s' % (quote_attrib(self.tauFac), ))
+    def exportChildren(self, outfile, level, namespace_='', name_='PlasticityMechanism', fromsubclass_=False, pretty_print=True):
         pass
     def build(self, node):
         already_processed = set()
@@ -1669,22 +1633,13 @@ class PlasticityMechanism(GeneratedsSuper):
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('type', node)
         if value is not None and 'type' not in already_processed:
             already_processed.add('type')
             self.type_ = value
             self.validate_PlasticityTypes(self.type_)    # validate type PlasticityTypes
-        value = find_attr_value_('tauFac', node)
-        if value is not None and 'tauFac' not in already_processed:
-            already_processed.add('tauFac')
-            self.tauFac = value
-            self.validate_Nml2Quantity_time(self.tauFac)    # validate type Nml2Quantity_time
-        value = find_attr_value_('tauRec', node)
-        if value is not None and 'tauRec' not in already_processed:
-            already_processed.add('tauRec')
-            self.tauRec = value
-            self.validate_Nml2Quantity_time(self.tauRec)    # validate type Nml2Quantity_time
         value = find_attr_value_('initReleaseProb', node)
         if value is not None and 'initReleaseProb' not in already_processed:
             already_processed.add('initReleaseProb')
@@ -1693,6 +1648,16 @@ class PlasticityMechanism(GeneratedsSuper):
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (initReleaseProb): %s' % exp)
             self.validate_ZeroToOne(self.initReleaseProb)    # validate type ZeroToOne
+        value = find_attr_value_('tauRec', node)
+        if value is not None and 'tauRec' not in already_processed:
+            already_processed.add('tauRec')
+            self.tauRec = value
+            self.validate_Nml2Quantity_time(self.tauRec)    # validate type Nml2Quantity_time
+        value = find_attr_value_('tauFac', node)
+        if value is not None and 'tauFac' not in already_processed:
+            already_processed.add('tauFac')
+            self.tauFac = value
+            self.validate_Nml2Quantity_time(self.tauFac)    # validate type Nml2Quantity_time
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         pass
 # end class PlasticityMechanism
@@ -1701,26 +1666,36 @@ class PlasticityMechanism(GeneratedsSuper):
 class SegmentParent(GeneratedsSuper):
     subclass = None
     superclass = None
-    def __init__(self, fractionAlong='1', segment=None):
-        self.fractionAlong = _cast(None, fractionAlong)
+    def __init__(self, segment=None, fractionAlong='1'):
+        self.original_tagname_ = None
         self.segment = _cast(None, segment)
-        pass
+        self.fractionAlong = _cast(None, fractionAlong)
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, SegmentParent)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if SegmentParent.subclass:
             return SegmentParent.subclass(*args_, **kwargs_)
         else:
             return SegmentParent(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_fractionAlong(self): return self.fractionAlong
-    def set_fractionAlong(self, fractionAlong): self.fractionAlong = fractionAlong
-    def validate_ZeroToOne(self, value):
-        # Validate type ZeroToOne, a restriction on xs:double.
-        pass
     def get_segment(self): return self.segment
     def set_segment(self, segment): self.segment = segment
+    def get_fractionAlong(self): return self.fractionAlong
+    def set_fractionAlong(self, fractionAlong): self.fractionAlong = fractionAlong
     def validate_SegmentId(self, value):
         # Validate type SegmentId, a restriction on xs:nonNegativeInteger.
-        pass
+        if value is not None and Validate_simpletypes_:
+            pass
+    def validate_ZeroToOne(self, value):
+        # Validate type ZeroToOne, a restriction on xs:double.
+        if value is not None and Validate_simpletypes_:
+            if value < 0:
+                warnings_.warn('Value "%(value)s" does not match xsd minInclusive restriction on ZeroToOne' % {"value" : value} )
+            if value > 1:
+                warnings_.warn('Value "%(value)s" does not match xsd maxInclusive restriction on ZeroToOne' % {"value" : value} )
     def hasContent_(self):
         if (
 
@@ -1733,57 +1708,35 @@ class SegmentParent(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='SegmentParent')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='SegmentParent', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='SegmentParent'):
-        if self.fractionAlong is not None and 'fractionAlong' not in already_processed:
-            already_processed.add('fractionAlong')
-            outfile.write(' fractionAlong=%s' % (quote_attrib(self.fractionAlong), ))
         if self.segment is not None and 'segment' not in already_processed:
             already_processed.add('segment')
             outfile.write(' segment=%s' % (quote_attrib(self.segment), ))
+        if self.fractionAlong != 1 and 'fractionAlong' not in already_processed:
+            already_processed.add('fractionAlong')
+            outfile.write(' fractionAlong=%s' % (quote_attrib(self.fractionAlong), ))
     def exportChildren(self, outfile, level, namespace_='', name_='SegmentParent', fromsubclass_=False, pretty_print=True):
         pass
-    def exportLiteral(self, outfile, level, name_='SegmentParent'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.fractionAlong is not None and 'fractionAlong' not in already_processed:
-            already_processed.add('fractionAlong')
-            showIndent(outfile, level)
-            outfile.write('fractionAlong=%e,\n' % (self.fractionAlong,))
-        if self.segment is not None and 'segment' not in already_processed:
-            already_processed.add('segment')
-            showIndent(outfile, level)
-            outfile.write('segment=%d,\n' % (self.segment,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        pass
-    def build(self, node):
+    def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('fractionAlong', node)
-        if value is not None and 'fractionAlong' not in already_processed:
-            already_processed.add('fractionAlong')
-            try:
-                self.fractionAlong = float(value)
-            except ValueError as exp:
-                raise ValueError('Bad float/double attribute (fractionAlong): %s' % exp)
-            self.validate_ZeroToOne(self.fractionAlong)    # validate type ZeroToOne
         value = find_attr_value_('segment', node)
         if value is not None and 'segment' not in already_processed:
             already_processed.add('segment')
@@ -1794,6 +1747,14 @@ class SegmentParent(GeneratedsSuper):
             if self.segment < 0:
                 raise_parse_error(node, 'Invalid NonNegativeInteger')
             self.validate_SegmentId(self.segment)    # validate type SegmentId
+        value = find_attr_value_('fractionAlong', node)
+        if value is not None and 'fractionAlong' not in already_processed:
+            already_processed.add('fractionAlong')
+            try:
+                self.fractionAlong = float(value)
+            except ValueError as exp:
+                raise ValueError('Bad float/double attribute (fractionAlong): %s' % exp)
+            self.validate_ZeroToOne(self.fractionAlong)    # validate type ZeroToOne
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         pass
 # end class SegmentParent
@@ -1803,22 +1764,27 @@ class Point3DWithDiam(GeneratedsSuper):
     """A 3D point with diameter."""
     subclass = None
     superclass = None
-    def __init__(self, y=None, x=None, z=None, diameter=None):
-        self.y = _cast(float, y)
+    def __init__(self, x=None, y=None, z=None, diameter=None):
+        self.original_tagname_ = None
         self.x = _cast(float, x)
+        self.y = _cast(float, y)
         self.z = _cast(float, z)
         self.diameter = _cast(float, diameter)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Point3DWithDiam)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Point3DWithDiam.subclass:
             return Point3DWithDiam.subclass(*args_, **kwargs_)
         else:
             return Point3DWithDiam(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_y(self): return self.y
-    def set_y(self, y): self.y = y
     def get_x(self): return self.x
     def set_x(self, x): self.x = x
+    def get_y(self): return self.y
+    def set_y(self, y): self.y = y
     def get_z(self): return self.z
     def set_z(self, z): self.z = z
     def get_diameter(self): return self.diameter
@@ -1835,23 +1801,25 @@ class Point3DWithDiam(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Point3DWithDiam')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Point3DWithDiam', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='Point3DWithDiam'):
-        if self.y is not None and 'y' not in already_processed:
-            already_processed.add('y')
-            outfile.write(' y="%s"' % self.gds_format_double(self.y, input_name='y'))
         if self.x is not None and 'x' not in already_processed:
             already_processed.add('x')
             outfile.write(' x="%s"' % self.gds_format_double(self.x, input_name='x'))
+        if self.y is not None and 'y' not in already_processed:
+            already_processed.add('y')
+            outfile.write(' y="%s"' % self.gds_format_double(self.y, input_name='y'))
         if self.z is not None and 'z' not in already_processed:
             already_processed.add('z')
             outfile.write(' z="%s"' % self.gds_format_double(self.z, input_name='z'))
@@ -1860,45 +1828,14 @@ class Point3DWithDiam(GeneratedsSuper):
             outfile.write(' diameter="%s"' % self.gds_format_double(self.diameter, input_name='diameter'))
     def exportChildren(self, outfile, level, namespace_='', name_='Point3DWithDiam', fromsubclass_=False, pretty_print=True):
         pass
-    def exportLiteral(self, outfile, level, name_='Point3DWithDiam'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.y is not None and 'y' not in already_processed:
-            already_processed.add('y')
-            showIndent(outfile, level)
-            outfile.write('y=%e,\n' % (self.y,))
-        if self.x is not None and 'x' not in already_processed:
-            already_processed.add('x')
-            showIndent(outfile, level)
-            outfile.write('x=%e,\n' % (self.x,))
-        if self.z is not None and 'z' not in already_processed:
-            already_processed.add('z')
-            showIndent(outfile, level)
-            outfile.write('z=%e,\n' % (self.z,))
-        if self.diameter is not None and 'diameter' not in already_processed:
-            already_processed.add('diameter')
-            showIndent(outfile, level)
-            outfile.write('diameter=%e,\n' % (self.diameter,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        pass
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('y', node)
-        if value is not None and 'y' not in already_processed:
-            already_processed.add('y')
-            try:
-                self.y = float(value)
-            except ValueError as exp:
-                raise ValueError('Bad float/double attribute (y): %s' % exp)
         value = find_attr_value_('x', node)
         if value is not None and 'x' not in already_processed:
             already_processed.add('x')
@@ -1906,6 +1843,13 @@ class Point3DWithDiam(GeneratedsSuper):
                 self.x = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (x): %s' % exp)
+        value = find_attr_value_('y', node)
+        if value is not None and 'y' not in already_processed:
+            already_processed.add('y')
+            try:
+                self.y = float(value)
+            except ValueError as exp:
+                raise ValueError('Bad float/double attribute (y): %s' % exp)
         value = find_attr_value_('z', node)
         if value is not None and 'z' not in already_processed:
             already_processed.add('z')
@@ -1929,9 +1873,14 @@ class ProximalDetails(GeneratedsSuper):
     subclass = None
     superclass = None
     def __init__(self, translationStart=None):
+        self.original_tagname_ = None
         self.translationStart = _cast(float, translationStart)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, ProximalDetails)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if ProximalDetails.subclass:
             return ProximalDetails.subclass(*args_, **kwargs_)
         else:
@@ -1951,13 +1900,15 @@ class ProximalDetails(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='ProximalDetails')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='ProximalDetails', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
@@ -1967,25 +1918,13 @@ class ProximalDetails(GeneratedsSuper):
             outfile.write(' translationStart="%s"' % self.gds_format_double(self.translationStart, input_name='translationStart'))
     def exportChildren(self, outfile, level, namespace_='', name_='ProximalDetails', fromsubclass_=False, pretty_print=True):
         pass
-    def exportLiteral(self, outfile, level, name_='ProximalDetails'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.translationStart is not None and 'translationStart' not in already_processed:
-            already_processed.add('translationStart')
-            showIndent(outfile, level)
-            outfile.write('translationStart=%e,\n' % (self.translationStart,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        pass
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('translationStart', node)
         if value is not None and 'translationStart' not in already_processed:
@@ -2003,9 +1942,14 @@ class DistalDetails(GeneratedsSuper):
     subclass = None
     superclass = None
     def __init__(self, normalizationEnd=None):
+        self.original_tagname_ = None
         self.normalizationEnd = _cast(float, normalizationEnd)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, DistalDetails)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if DistalDetails.subclass:
             return DistalDetails.subclass(*args_, **kwargs_)
         else:
@@ -2025,13 +1969,15 @@ class DistalDetails(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='DistalDetails')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='DistalDetails', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
@@ -2041,25 +1987,13 @@ class DistalDetails(GeneratedsSuper):
             outfile.write(' normalizationEnd="%s"' % self.gds_format_double(self.normalizationEnd, input_name='normalizationEnd'))
     def exportChildren(self, outfile, level, namespace_='', name_='DistalDetails', fromsubclass_=False, pretty_print=True):
         pass
-    def exportLiteral(self, outfile, level, name_='DistalDetails'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.normalizationEnd is not None and 'normalizationEnd' not in already_processed:
-            already_processed.add('normalizationEnd')
-            showIndent(outfile, level)
-            outfile.write('normalizationEnd=%e,\n' % (self.normalizationEnd,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        pass
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('normalizationEnd', node)
         if value is not None and 'normalizationEnd' not in already_processed:
@@ -2077,9 +2011,14 @@ class Member(GeneratedsSuper):
     subclass = None
     superclass = None
     def __init__(self, segment=None):
+        self.original_tagname_ = None
         self.segment = _cast(None, segment)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Member)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Member.subclass:
             return Member.subclass(*args_, **kwargs_)
         else:
@@ -2089,7 +2028,8 @@ class Member(GeneratedsSuper):
     def set_segment(self, segment): self.segment = segment
     def validate_SegmentId(self, value):
         # Validate type SegmentId, a restriction on xs:nonNegativeInteger.
-        pass
+        if value is not None and Validate_simpletypes_:
+            pass
     def hasContent_(self):
         if (
 
@@ -2102,13 +2042,15 @@ class Member(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Member')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Member', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
@@ -2118,25 +2060,13 @@ class Member(GeneratedsSuper):
             outfile.write(' segment=%s' % (quote_attrib(self.segment), ))
     def exportChildren(self, outfile, level, namespace_='', name_='Member', fromsubclass_=False, pretty_print=True):
         pass
-    def exportLiteral(self, outfile, level, name_='Member'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.segment is not None and 'segment' not in already_processed:
-            already_processed.add('segment')
-            showIndent(outfile, level)
-            outfile.write('segment=%d,\n' % (self.segment,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        pass
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('segment', node)
         if value is not None and 'segment' not in already_processed:
@@ -2157,9 +2087,14 @@ class Include(GeneratedsSuper):
     subclass = None
     superclass = None
     def __init__(self, segmentGroup=None):
+        self.original_tagname_ = None
         self.segmentGroup = _cast(None, segmentGroup)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Include)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Include.subclass:
             return Include.subclass(*args_, **kwargs_)
         else:
@@ -2169,7 +2104,11 @@ class Include(GeneratedsSuper):
     def set_segmentGroup(self, segmentGroup): self.segmentGroup = segmentGroup
     def validate_NmlId(self, value):
         # Validate type NmlId, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
     def hasContent_(self):
         if (
 
@@ -2182,13 +2121,15 @@ class Include(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Include')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Include', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
@@ -2198,25 +2139,13 @@ class Include(GeneratedsSuper):
             outfile.write(' segmentGroup=%s' % (quote_attrib(self.segmentGroup), ))
     def exportChildren(self, outfile, level, namespace_='', name_='Include', fromsubclass_=False, pretty_print=True):
         pass
-    def exportLiteral(self, outfile, level, name_='Include'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.segmentGroup is not None and 'segmentGroup' not in already_processed:
-            already_processed.add('segmentGroup')
-            showIndent(outfile, level)
-            outfile.write('segmentGroup="%s",\n' % (self.segmentGroup,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        pass
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('segmentGroup', node)
         if value is not None and 'segmentGroup' not in already_processed:
@@ -2231,22 +2160,28 @@ class Include(GeneratedsSuper):
 class Path(GeneratedsSuper):
     subclass = None
     superclass = None
-    def __init__(self, fromxx=None, to=None):
-        self.fromxx = fromxx
+    def __init__(self, from_=None, to=None):
+        self.original_tagname_ = None
+        self.from_ = from_
         self.to = to
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Path)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Path.subclass:
             return Path.subclass(*args_, **kwargs_)
         else:
             return Path(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_from(self): return self.fromxx
-    def set_from(self, fromxx): self.fromxx = fromxx
+    def get_from(self): return self.from_
+    def set_from(self, from_): self.from_ = from_
     def get_to(self): return self.to
     def set_to(self, to): self.to = to
     def hasContent_(self):
         if (
-            self.fromxx is not None or
+            self.from_ is not None or
             self.to is not None
         ):
             return True
@@ -2257,13 +2192,15 @@ class Path(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Path')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Path', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -2275,70 +2212,58 @@ class Path(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
-        if self.fromxx is not None:
-            self.fromxx.export(outfile, level, namespace_, name_='from', pretty_print=pretty_print)
+        if self.from_ is not None:
+            self.from_.export(outfile, level, namespace_, name_='from', pretty_print=pretty_print)
         if self.to is not None:
             self.to.export(outfile, level, namespace_, name_='to', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='Path'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        pass
-    def exportLiteralChildren(self, outfile, level, name_):
-        if self.fromxx is not None:
-            showIndent(outfile, level)
-            outfile.write('fromxx=model_.SegmentEndPoint(\n')
-            self.fromxx.exportLiteral(outfile, level, name_='from')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.to is not None:
-            showIndent(outfile, level)
-            outfile.write('to=model_.SegmentEndPoint(\n')
-            self.to.exportLiteral(outfile, level, name_='to')
-            showIndent(outfile, level)
-            outfile.write('),\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         pass
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         if nodeName_ == 'from':
             obj_ = SegmentEndPoint.factory()
             obj_.build(child_)
-            self.set_from(obj_)
+            self.from_ = obj_
+            obj_.original_tagname_ = 'from'
         elif nodeName_ == 'to':
             obj_ = SegmentEndPoint.factory()
             obj_.build(child_)
-            self.set_to(obj_)
+            self.to = obj_
+            obj_.original_tagname_ = 'to'
 # end class Path
 
 
 class SubTree(GeneratedsSuper):
     subclass = None
     superclass = None
-    def __init__(self, fromxx=None, to=None):
-        self.fromxx = fromxx
+    def __init__(self, from_=None, to=None):
+        self.original_tagname_ = None
+        self.from_ = from_
         self.to = to
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, SubTree)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if SubTree.subclass:
             return SubTree.subclass(*args_, **kwargs_)
         else:
             return SubTree(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_from(self): return self.fromxx
-    def set_from(self, fromxx): self.fromxx = fromxx
+    def get_from(self): return self.from_
+    def set_from(self, from_): self.from_ = from_
     def get_to(self): return self.to
     def set_to(self, to): self.to = to
     def hasContent_(self):
         if (
-            self.fromxx is not None or
+            self.from_ is not None or
             self.to is not None
         ):
             return True
@@ -2349,13 +2274,15 @@ class SubTree(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='SubTree')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='SubTree', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -2367,48 +2294,30 @@ class SubTree(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
-        if self.fromxx is not None:
-            self.fromxx.export(outfile, level, namespace_, name_='from', pretty_print=pretty_print)
+        if self.from_ is not None:
+            self.from_.export(outfile, level, namespace_, name_='from', pretty_print=pretty_print)
         if self.to is not None:
             self.to.export(outfile, level, namespace_, name_='to', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='SubTree'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        pass
-    def exportLiteralChildren(self, outfile, level, name_):
-        if self.fromxx is not None:
-            showIndent(outfile, level)
-            outfile.write('fromxx=model_.SegmentEndPoint(\n')
-            self.fromxx.exportLiteral(outfile, level, name_='from')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.to is not None:
-            showIndent(outfile, level)
-            outfile.write('to=model_.SegmentEndPoint(\n')
-            self.to.exportLiteral(outfile, level, name_='to')
-            showIndent(outfile, level)
-            outfile.write('),\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         pass
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         if nodeName_ == 'from':
             obj_ = SegmentEndPoint.factory()
             obj_.build(child_)
-            self.set_from(obj_)
+            self.from_ = obj_
+            obj_.original_tagname_ = 'from'
         elif nodeName_ == 'to':
             obj_ = SegmentEndPoint.factory()
             obj_.build(child_)
-            self.set_to(obj_)
+            self.to = obj_
+            obj_.original_tagname_ = 'to'
 # end class SubTree
 
 
@@ -2416,9 +2325,14 @@ class SegmentEndPoint(GeneratedsSuper):
     subclass = None
     superclass = None
     def __init__(self, segment=None):
+        self.original_tagname_ = None
         self.segment = _cast(None, segment)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, SegmentEndPoint)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if SegmentEndPoint.subclass:
             return SegmentEndPoint.subclass(*args_, **kwargs_)
         else:
@@ -2428,7 +2342,8 @@ class SegmentEndPoint(GeneratedsSuper):
     def set_segment(self, segment): self.segment = segment
     def validate_SegmentId(self, value):
         # Validate type SegmentId, a restriction on xs:nonNegativeInteger.
-        pass
+        if value is not None and Validate_simpletypes_:
+            pass
     def hasContent_(self):
         if (
 
@@ -2441,13 +2356,15 @@ class SegmentEndPoint(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='SegmentEndPoint')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='SegmentEndPoint', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
@@ -2457,25 +2374,13 @@ class SegmentEndPoint(GeneratedsSuper):
             outfile.write(' segment=%s' % (quote_attrib(self.segment), ))
     def exportChildren(self, outfile, level, namespace_='', name_='SegmentEndPoint', fromsubclass_=False, pretty_print=True):
         pass
-    def exportLiteral(self, outfile, level, name_='SegmentEndPoint'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.segment is not None and 'segment' not in already_processed:
-            already_processed.add('segment')
-            showIndent(outfile, level)
-            outfile.write('segment=%d,\n' % (self.segment,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        pass
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('segment', node)
         if value is not None and 'segment' not in already_processed:
@@ -2496,6 +2401,7 @@ class MembraneProperties(GeneratedsSuper):
     subclass = None
     superclass = None
     def __init__(self, channelPopulation=None, channelDensity=None, spikeThresh=None, specificCapacitance=None, initMembPotential=None, reversalPotential=None):
+        self.original_tagname_ = None
         if channelPopulation is None:
             self.channelPopulation = []
         else:
@@ -2521,6 +2427,11 @@ class MembraneProperties(GeneratedsSuper):
         else:
             self.reversalPotential = reversalPotential
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, MembraneProperties)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if MembraneProperties.subclass:
             return MembraneProperties.subclass(*args_, **kwargs_)
         else:
@@ -2529,27 +2440,33 @@ class MembraneProperties(GeneratedsSuper):
     def get_channelPopulation(self): return self.channelPopulation
     def set_channelPopulation(self, channelPopulation): self.channelPopulation = channelPopulation
     def add_channelPopulation(self, value): self.channelPopulation.append(value)
-    def insert_channelPopulation(self, index, value): self.channelPopulation[index] = value
+    def insert_channelPopulation_at(self, index, value): self.channelPopulation.insert(index, value)
+    def replace_channelPopulation_at(self, index, value): self.channelPopulation[index] = value
     def get_channelDensity(self): return self.channelDensity
     def set_channelDensity(self, channelDensity): self.channelDensity = channelDensity
     def add_channelDensity(self, value): self.channelDensity.append(value)
-    def insert_channelDensity(self, index, value): self.channelDensity[index] = value
+    def insert_channelDensity_at(self, index, value): self.channelDensity.insert(index, value)
+    def replace_channelDensity_at(self, index, value): self.channelDensity[index] = value
     def get_spikeThresh(self): return self.spikeThresh
     def set_spikeThresh(self, spikeThresh): self.spikeThresh = spikeThresh
     def add_spikeThresh(self, value): self.spikeThresh.append(value)
-    def insert_spikeThresh(self, index, value): self.spikeThresh[index] = value
+    def insert_spikeThresh_at(self, index, value): self.spikeThresh.insert(index, value)
+    def replace_spikeThresh_at(self, index, value): self.spikeThresh[index] = value
     def get_specificCapacitance(self): return self.specificCapacitance
     def set_specificCapacitance(self, specificCapacitance): self.specificCapacitance = specificCapacitance
     def add_specificCapacitance(self, value): self.specificCapacitance.append(value)
-    def insert_specificCapacitance(self, index, value): self.specificCapacitance[index] = value
+    def insert_specificCapacitance_at(self, index, value): self.specificCapacitance.insert(index, value)
+    def replace_specificCapacitance_at(self, index, value): self.specificCapacitance[index] = value
     def get_initMembPotential(self): return self.initMembPotential
     def set_initMembPotential(self, initMembPotential): self.initMembPotential = initMembPotential
     def add_initMembPotential(self, value): self.initMembPotential.append(value)
-    def insert_initMembPotential(self, index, value): self.initMembPotential[index] = value
+    def insert_initMembPotential_at(self, index, value): self.initMembPotential.insert(index, value)
+    def replace_initMembPotential_at(self, index, value): self.initMembPotential[index] = value
     def get_reversalPotential(self): return self.reversalPotential
     def set_reversalPotential(self, reversalPotential): self.reversalPotential = reversalPotential
     def add_reversalPotential(self, value): self.reversalPotential.append(value)
-    def insert_reversalPotential(self, index, value): self.reversalPotential[index] = value
+    def insert_reversalPotential_at(self, index, value): self.reversalPotential.insert(index, value)
+    def replace_reversalPotential_at(self, index, value): self.reversalPotential[index] = value
     def hasContent_(self):
         if (
             self.channelPopulation or
@@ -2567,13 +2484,15 @@ class MembraneProperties(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='MembraneProperties')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='MembraneProperties', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -2597,93 +2516,13 @@ class MembraneProperties(GeneratedsSuper):
             initMembPotential_.export(outfile, level, namespace_, name_='initMembPotential', pretty_print=pretty_print)
         for reversalPotential_ in self.reversalPotential:
             reversalPotential_.export(outfile, level, namespace_, name_='reversalPotential', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='MembraneProperties'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        pass
-    def exportLiteralChildren(self, outfile, level, name_):
-        showIndent(outfile, level)
-        outfile.write('channelPopulation=[\n')
-        level += 1
-        for channelPopulation_ in self.channelPopulation:
-            showIndent(outfile, level)
-            outfile.write('model_.ChannelPopulation(\n')
-            channelPopulation_.exportLiteral(outfile, level, name_='ChannelPopulation')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('channelDensity=[\n')
-        level += 1
-        for channelDensity_ in self.channelDensity:
-            showIndent(outfile, level)
-            outfile.write('model_.ChannelDensity(\n')
-            channelDensity_.exportLiteral(outfile, level, name_='ChannelDensity')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('spikeThresh=[\n')
-        level += 1
-        for spikeThresh_ in self.spikeThresh:
-            showIndent(outfile, level)
-            outfile.write('model_.ValueAcrossSegOrSegGroup(\n')
-            spikeThresh_.exportLiteral(outfile, level, name_='ValueAcrossSegOrSegGroup')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('specificCapacitance=[\n')
-        level += 1
-        for specificCapacitance_ in self.specificCapacitance:
-            showIndent(outfile, level)
-            outfile.write('model_.ValueAcrossSegOrSegGroup(\n')
-            specificCapacitance_.exportLiteral(outfile, level, name_='ValueAcrossSegOrSegGroup')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('initMembPotential=[\n')
-        level += 1
-        for initMembPotential_ in self.initMembPotential:
-            showIndent(outfile, level)
-            outfile.write('model_.ValueAcrossSegOrSegGroup(\n')
-            initMembPotential_.exportLiteral(outfile, level, name_='ValueAcrossSegOrSegGroup')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('reversalPotential=[\n')
-        level += 1
-        for reversalPotential_ in self.reversalPotential:
-            showIndent(outfile, level)
-            outfile.write('model_.ReversalPotential(\n')
-            reversalPotential_.exportLiteral(outfile, level, name_='ReversalPotential')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         pass
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
@@ -2691,60 +2530,80 @@ class MembraneProperties(GeneratedsSuper):
             obj_ = ChannelPopulation.factory()
             obj_.build(child_)
             self.channelPopulation.append(obj_)
+            obj_.original_tagname_ = 'channelPopulation'
         elif nodeName_ == 'channelDensity':
             obj_ = ChannelDensity.factory()
             obj_.build(child_)
             self.channelDensity.append(obj_)
+            obj_.original_tagname_ = 'channelDensity'
         elif nodeName_ == 'spikeThresh':
             class_obj_ = self.get_class_obj_(child_, ValueAcrossSegOrSegGroup)
             obj_ = class_obj_.factory()
             obj_.build(child_)
             self.spikeThresh.append(obj_)
+            obj_.original_tagname_ = 'spikeThresh'
         elif nodeName_ == 'specificCapacitance':
             class_obj_ = self.get_class_obj_(child_, ValueAcrossSegOrSegGroup)
             obj_ = class_obj_.factory()
             obj_.build(child_)
             self.specificCapacitance.append(obj_)
+            obj_.original_tagname_ = 'specificCapacitance'
         elif nodeName_ == 'initMembPotential':
             class_obj_ = self.get_class_obj_(child_, ValueAcrossSegOrSegGroup)
             obj_ = class_obj_.factory()
             obj_.build(child_)
             self.initMembPotential.append(obj_)
+            obj_.original_tagname_ = 'initMembPotential'
         elif nodeName_ == 'reversalPotential':
             obj_ = ReversalPotential.factory()
             obj_.build(child_)
             self.reversalPotential.append(obj_)
+            obj_.original_tagname_ = 'reversalPotential'
 # end class MembraneProperties
 
 
 class ValueAcrossSegOrSegGroup(GeneratedsSuper):
     subclass = None
     superclass = None
-    def __init__(self, segment=None, segmentGroup='all', value=None, extensiontype_=None):
-        self.segment = _cast(None, segment)
-        self.segmentGroup = _cast(None, segmentGroup)
+    def __init__(self, value=None, segmentGroup='all', segment=None, extensiontype_=None):
+        self.original_tagname_ = None
         self.value = _cast(None, value)
+        self.segmentGroup = _cast(None, segmentGroup)
+        self.segment = _cast(None, segment)
         self.extensiontype_ = extensiontype_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, ValueAcrossSegOrSegGroup)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if ValueAcrossSegOrSegGroup.subclass:
             return ValueAcrossSegOrSegGroup.subclass(*args_, **kwargs_)
         else:
             return ValueAcrossSegOrSegGroup(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_segment(self): return self.segment
-    def set_segment(self, segment): self.segment = segment
-    def validate_NmlId(self, value):
-        # Validate type NmlId, a restriction on xs:string.
-        pass
-    def get_segmentGroup(self): return self.segmentGroup
-    def set_segmentGroup(self, segmentGroup): self.segmentGroup = segmentGroup
     def get_value(self): return self.value
     def set_value(self, value): self.value = value
-    def validate_Nml2Quantity(self, value):
-        # Validate type Nml2Quantity, a restriction on xs:string.
-        pass
+    def get_segmentGroup(self): return self.segmentGroup
+    def set_segmentGroup(self, segmentGroup): self.segmentGroup = segmentGroup
+    def get_segment(self): return self.segment
+    def set_segment(self, segment): self.segment = segment
     def get_extensiontype_(self): return self.extensiontype_
     def set_extensiontype_(self, extensiontype_): self.extensiontype_ = extensiontype_
+    def validate_Nml2Quantity(self, value):
+        # Validate type Nml2Quantity, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_patterns_, ))
+    validate_Nml2Quantity_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*([_a-zA-Z0-9])*$']]
+    def validate_NmlId(self, value):
+        # Validate type NmlId, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
     def hasContent_(self):
         if (
 
@@ -2757,75 +2616,57 @@ class ValueAcrossSegOrSegGroup(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='ValueAcrossSegOrSegGroup')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='ValueAcrossSegOrSegGroup', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='ValueAcrossSegOrSegGroup'):
-        if self.segment is not None and 'segment' not in already_processed:
-            already_processed.add('segment')
-            outfile.write(' segment=%s' % (quote_attrib(self.segment), ))
-        if self.segmentGroup is not None and 'segmentGroup' not in already_processed:
-            already_processed.add('segmentGroup')
-            outfile.write(' segmentGroup=%s' % (quote_attrib(self.segmentGroup), ))
         if self.value is not None and 'value' not in already_processed:
             already_processed.add('value')
             outfile.write(' value=%s' % (quote_attrib(self.value), ))
+        if self.segmentGroup != "all" and 'segmentGroup' not in already_processed:
+            already_processed.add('segmentGroup')
+            outfile.write(' segmentGroup=%s' % (quote_attrib(self.segmentGroup), ))
+        if self.segment is not None and 'segment' not in already_processed:
+            already_processed.add('segment')
+            outfile.write(' segment=%s' % (quote_attrib(self.segment), ))
         if self.extensiontype_ is not None and 'xsi:type' not in already_processed:
             already_processed.add('xsi:type')
             outfile.write(' xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"')
             outfile.write(' xsi:type="%s"' % self.extensiontype_)
     def exportChildren(self, outfile, level, namespace_='', name_='ValueAcrossSegOrSegGroup', fromsubclass_=False, pretty_print=True):
         pass
-    def exportLiteral(self, outfile, level, name_='ValueAcrossSegOrSegGroup'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.segment is not None and 'segment' not in already_processed:
-            already_processed.add('segment')
-            showIndent(outfile, level)
-            outfile.write('segment="%s",\n' % (self.segment,))
-        if self.segmentGroup is not None and 'segmentGroup' not in already_processed:
-            already_processed.add('segmentGroup')
-            showIndent(outfile, level)
-            outfile.write('segmentGroup="%s",\n' % (self.segmentGroup,))
-        if self.value is not None and 'value' not in already_processed:
-            already_processed.add('value')
-            showIndent(outfile, level)
-            outfile.write('value="%s",\n' % (self.value,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        pass
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('segment', node)
-        if value is not None and 'segment' not in already_processed:
-            already_processed.add('segment')
-            self.segment = value
-            self.validate_NmlId(self.segment)    # validate type NmlId
-        value = find_attr_value_('segmentGroup', node)
-        if value is not None and 'segmentGroup' not in already_processed:
-            already_processed.add('segmentGroup')
-            self.segmentGroup = value
-            self.validate_NmlId(self.segmentGroup)    # validate type NmlId
         value = find_attr_value_('value', node)
         if value is not None and 'value' not in already_processed:
             already_processed.add('value')
             self.value = value
             self.validate_Nml2Quantity(self.value)    # validate type Nml2Quantity
+        value = find_attr_value_('segmentGroup', node)
+        if value is not None and 'segmentGroup' not in already_processed:
+            already_processed.add('segmentGroup')
+            self.segmentGroup = value
+            self.validate_NmlId(self.segmentGroup)    # validate type NmlId
+        value = find_attr_value_('segment', node)
+        if value is not None and 'segment' not in already_processed:
+            already_processed.add('segment')
+            self.segment = value
+            self.validate_NmlId(self.segment)    # validate type NmlId
         value = find_attr_value_('xsi:type', node)
         if value is not None and 'xsi:type' not in already_processed:
             already_processed.add('xsi:type')
@@ -2838,11 +2679,17 @@ class ValueAcrossSegOrSegGroup(GeneratedsSuper):
 class VariableParameter(GeneratedsSuper):
     subclass = None
     superclass = None
-    def __init__(self, segmentGroup=None, parameter=None, inhomogeneousValue=None):
-        self.segmentGroup = _cast(None, segmentGroup)
+    def __init__(self, parameter=None, segmentGroup=None, inhomogeneousValue=None):
+        self.original_tagname_ = None
         self.parameter = _cast(None, parameter)
+        self.segmentGroup = _cast(None, segmentGroup)
         self.inhomogeneousValue = inhomogeneousValue
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, VariableParameter)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if VariableParameter.subclass:
             return VariableParameter.subclass(*args_, **kwargs_)
         else:
@@ -2850,10 +2697,10 @@ class VariableParameter(GeneratedsSuper):
     factory = staticmethod(factory)
     def get_inhomogeneousValue(self): return self.inhomogeneousValue
     def set_inhomogeneousValue(self, inhomogeneousValue): self.inhomogeneousValue = inhomogeneousValue
-    def get_segmentGroup(self): return self.segmentGroup
-    def set_segmentGroup(self, segmentGroup): self.segmentGroup = segmentGroup
     def get_parameter(self): return self.parameter
     def set_parameter(self, parameter): self.parameter = parameter
+    def get_segmentGroup(self): return self.segmentGroup
+    def set_segmentGroup(self, segmentGroup): self.segmentGroup = segmentGroup
     def hasContent_(self):
         if (
             self.inhomogeneousValue is not None
@@ -2866,24 +2713,26 @@ class VariableParameter(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='VariableParameter')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='VariableParameter', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='VariableParameter'):
-        if self.segmentGroup is not None and 'segmentGroup' not in already_processed:
-            already_processed.add('segmentGroup')
-            outfile.write(' segmentGroup=%s' % (self.gds_format_string(quote_attrib(self.segmentGroup).encode(ExternalEncoding), input_name='segmentGroup'), ))
         if self.parameter is not None and 'parameter' not in already_processed:
             already_processed.add('parameter')
-            outfile.write(' parameter=%s' % (self.gds_format_string(quote_attrib(self.parameter).encode(ExternalEncoding), input_name='parameter'), ))
+            outfile.write(' parameter=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.parameter), input_name='parameter')), ))
+        if self.segmentGroup is not None and 'segmentGroup' not in already_processed:
+            already_processed.add('segmentGroup')
+            outfile.write(' segmentGroup=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.segmentGroup), input_name='segmentGroup')), ))
     def exportChildren(self, outfile, level, namespace_='', name_='VariableParameter', fromsubclass_=False, pretty_print=True):
         if pretty_print:
             eol_ = '\n'
@@ -2891,48 +2740,28 @@ class VariableParameter(GeneratedsSuper):
             eol_ = ''
         if self.inhomogeneousValue is not None:
             self.inhomogeneousValue.export(outfile, level, namespace_, name_='inhomogeneousValue', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='VariableParameter'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.segmentGroup is not None and 'segmentGroup' not in already_processed:
-            already_processed.add('segmentGroup')
-            showIndent(outfile, level)
-            outfile.write('segmentGroup="%s",\n' % (self.segmentGroup,))
-        if self.parameter is not None and 'parameter' not in already_processed:
-            already_processed.add('parameter')
-            showIndent(outfile, level)
-            outfile.write('parameter="%s",\n' % (self.parameter,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        if self.inhomogeneousValue is not None:
-            showIndent(outfile, level)
-            outfile.write('inhomogeneousValue=model_.InhomogeneousValue(\n')
-            self.inhomogeneousValue.exportLiteral(outfile, level, name_='inhomogeneousValue')
-            showIndent(outfile, level)
-            outfile.write('),\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('segmentGroup', node)
-        if value is not None and 'segmentGroup' not in already_processed:
-            already_processed.add('segmentGroup')
-            self.segmentGroup = value
         value = find_attr_value_('parameter', node)
         if value is not None and 'parameter' not in already_processed:
             already_processed.add('parameter')
             self.parameter = value
+        value = find_attr_value_('segmentGroup', node)
+        if value is not None and 'segmentGroup' not in already_processed:
+            already_processed.add('segmentGroup')
+            self.segmentGroup = value
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         if nodeName_ == 'inhomogeneousValue':
             obj_ = InhomogeneousValue.factory()
             obj_.build(child_)
-            self.set_inhomogeneousValue(obj_)
+            self.inhomogeneousValue = obj_
+            obj_.original_tagname_ = 'inhomogeneousValue'
 # end class VariableParameter
 
 
@@ -2940,10 +2769,15 @@ class InhomogeneousValue(GeneratedsSuper):
     subclass = None
     superclass = None
     def __init__(self, inhomogeneousParam=None, value=None):
+        self.original_tagname_ = None
         self.inhomogeneousParam = _cast(None, inhomogeneousParam)
         self.value = _cast(None, value)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, InhomogeneousValue)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if InhomogeneousValue.subclass:
             return InhomogeneousValue.subclass(*args_, **kwargs_)
         else:
@@ -2965,48 +2799,34 @@ class InhomogeneousValue(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='InhomogeneousValue')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='InhomogeneousValue', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='InhomogeneousValue'):
         if self.inhomogeneousParam is not None and 'inhomogeneousParam' not in already_processed:
             already_processed.add('inhomogeneousParam')
-            outfile.write(' inhomogeneousParam=%s' % (self.gds_format_string(quote_attrib(self.inhomogeneousParam).encode(ExternalEncoding), input_name='inhomogeneousParam'), ))
+            outfile.write(' inhomogeneousParam=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.inhomogeneousParam), input_name='inhomogeneousParam')), ))
         if self.value is not None and 'value' not in already_processed:
             already_processed.add('value')
-            outfile.write(' value=%s' % (self.gds_format_string(quote_attrib(self.value).encode(ExternalEncoding), input_name='value'), ))
+            outfile.write(' value=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.value), input_name='value')), ))
     def exportChildren(self, outfile, level, namespace_='', name_='InhomogeneousValue', fromsubclass_=False, pretty_print=True):
         pass
-    def exportLiteral(self, outfile, level, name_='InhomogeneousValue'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.inhomogeneousParam is not None and 'inhomogeneousParam' not in already_processed:
-            already_processed.add('inhomogeneousParam')
-            showIndent(outfile, level)
-            outfile.write('inhomogeneousParam="%s",\n' % (self.inhomogeneousParam,))
-        if self.value is not None and 'value' not in already_processed:
-            already_processed.add('value')
-            showIndent(outfile, level)
-            outfile.write('value="%s",\n' % (self.value,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        pass
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('inhomogeneousParam', node)
         if value is not None and 'inhomogeneousParam' not in already_processed:
@@ -3024,11 +2844,16 @@ class InhomogeneousValue(GeneratedsSuper):
 class ReversalPotential(ValueAcrossSegOrSegGroup):
     subclass = None
     superclass = ValueAcrossSegOrSegGroup
-    def __init__(self, segment=None, segmentGroup='all', value=None, species=None):
-        super(ReversalPotential, self).__init__(segment, segmentGroup, value, )
+    def __init__(self, value=None, segmentGroup='all', segment=None, species=None):
+        self.original_tagname_ = None
+        super(ReversalPotential, self).__init__(value, segmentGroup, segment, )
         self.species = _cast(None, species)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, ReversalPotential)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if ReversalPotential.subclass:
             return ReversalPotential.subclass(*args_, **kwargs_)
         else:
@@ -3038,7 +2863,11 @@ class ReversalPotential(ValueAcrossSegOrSegGroup):
     def set_species(self, species): self.species = species
     def validate_NmlId(self, value):
         # Validate type NmlId, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
     def hasContent_(self):
         if (
             super(ReversalPotential, self).hasContent_()
@@ -3051,13 +2880,15 @@ class ReversalPotential(ValueAcrossSegOrSegGroup):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='ReversalPotential')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='ReversalPotential', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
@@ -3069,27 +2900,13 @@ class ReversalPotential(ValueAcrossSegOrSegGroup):
     def exportChildren(self, outfile, level, namespace_='', name_='ReversalPotential', fromsubclass_=False, pretty_print=True):
         super(ReversalPotential, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
         pass
-    def exportLiteral(self, outfile, level, name_='ReversalPotential'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.species is not None and 'species' not in already_processed:
-            already_processed.add('species')
-            showIndent(outfile, level)
-            outfile.write('species="%s",\n' % (self.species,))
-        super(ReversalPotential, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(ReversalPotential, self).exportLiteralChildren(outfile, level, name_)
-        pass
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('species', node)
         if value is not None and 'species' not in already_processed:
@@ -3109,36 +2926,49 @@ class Species(ValueAcrossSegOrSegGroup):
     select by id. TODO: remove."""
     subclass = None
     superclass = ValueAcrossSegOrSegGroup
-    def __init__(self, segment=None, segmentGroup='all', value=None, ion=None, initialExtConcentration=None, concentrationModel=None, id=None, initialConcentration=None):
-        super(Species, self).__init__(segment, segmentGroup, value, )
-        self.ion = _cast(None, ion)
-        self.initialExtConcentration = _cast(None, initialExtConcentration)
-        self.concentrationModel = _cast(None, concentrationModel)
+    def __init__(self, value=None, segmentGroup='all', segment=None, id=None, concentrationModel=None, ion=None, initialConcentration=None, initialExtConcentration=None):
+        self.original_tagname_ = None
+        super(Species, self).__init__(value, segmentGroup, segment, )
         self.id = _cast(None, id)
+        self.concentrationModel = _cast(None, concentrationModel)
+        self.ion = _cast(None, ion)
         self.initialConcentration = _cast(None, initialConcentration)
-        pass
+        self.initialExtConcentration = _cast(None, initialExtConcentration)
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Species)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Species.subclass:
             return Species.subclass(*args_, **kwargs_)
         else:
             return Species(*args_, **kwargs_)
     factory = staticmethod(factory)
+    def get_id(self): return self.id
+    def set_id(self, id): self.id = id
+    def get_concentrationModel(self): return self.concentrationModel
+    def set_concentrationModel(self, concentrationModel): self.concentrationModel = concentrationModel
     def get_ion(self): return self.ion
     def set_ion(self, ion): self.ion = ion
-    def validate_NmlId(self, value):
-        # Validate type NmlId, a restriction on xs:string.
-        pass
+    def get_initialConcentration(self): return self.initialConcentration
+    def set_initialConcentration(self, initialConcentration): self.initialConcentration = initialConcentration
     def get_initialExtConcentration(self): return self.initialExtConcentration
     def set_initialExtConcentration(self, initialExtConcentration): self.initialExtConcentration = initialExtConcentration
+    def validate_NmlId(self, value):
+        # Validate type NmlId, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
     def validate_Nml2Quantity_concentration(self, value):
         # Validate type Nml2Quantity_concentration, a restriction on xs:string.
-        pass
-    def get_concentrationModel(self): return self.concentrationModel
-    def set_concentrationModel(self, concentrationModel): self.concentrationModel = concentrationModel
-    def get_id(self): return self.id
-    def set_id(self, id): self.id = id
-    def get_initialConcentration(self): return self.initialConcentration
-    def set_initialConcentration(self, initialConcentration): self.initialConcentration = initialConcentration
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_concentration_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_concentration_patterns_, ))
+    validate_Nml2Quantity_concentration_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(mol_per_m3|mol_per_cm3|M|mM)$']]
     def hasContent_(self):
         if (
             super(Species, self).hasContent_()
@@ -3151,66 +2981,37 @@ class Species(ValueAcrossSegOrSegGroup):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Species')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Species', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='Species'):
         super(Species, self).exportAttributes(outfile, level, already_processed, namespace_, name_='Species')
-        if self.ion is not None and 'ion' not in already_processed:
-            already_processed.add('ion')
-            outfile.write(' ion=%s' % (quote_attrib(self.ion), ))
-        if self.initialExtConcentration is not None and 'initialExtConcentration' not in already_processed:
-            already_processed.add('initialExtConcentration')
-            outfile.write(' initialExtConcentration=%s' % (quote_attrib(self.initialExtConcentration), ))
-        if self.concentrationModel is not None and 'concentrationModel' not in already_processed:
-            already_processed.add('concentrationModel')
-            outfile.write(' concentrationModel=%s' % (quote_attrib(self.concentrationModel), ))
         if self.id is not None and 'id' not in already_processed:
             already_processed.add('id')
             outfile.write(' id=%s' % (quote_attrib(self.id), ))
+        if self.concentrationModel is not None and 'concentrationModel' not in already_processed:
+            already_processed.add('concentrationModel')
+            outfile.write(' concentrationModel=%s' % (quote_attrib(self.concentrationModel), ))
+        if self.ion is not None and 'ion' not in already_processed:
+            already_processed.add('ion')
+            outfile.write(' ion=%s' % (quote_attrib(self.ion), ))
         if self.initialConcentration is not None and 'initialConcentration' not in already_processed:
             already_processed.add('initialConcentration')
             outfile.write(' initialConcentration=%s' % (quote_attrib(self.initialConcentration), ))
-    def exportChildren(self, outfile, level, namespace_='', name_='Species', fromsubclass_=False, pretty_print=True):
-        super(Species, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-        pass
-    def exportLiteral(self, outfile, level, name_='Species'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.ion is not None and 'ion' not in already_processed:
-            already_processed.add('ion')
-            showIndent(outfile, level)
-            outfile.write('ion="%s",\n' % (self.ion,))
         if self.initialExtConcentration is not None and 'initialExtConcentration' not in already_processed:
             already_processed.add('initialExtConcentration')
-            showIndent(outfile, level)
-            outfile.write('initialExtConcentration="%s",\n' % (self.initialExtConcentration,))
-        if self.concentrationModel is not None and 'concentrationModel' not in already_processed:
-            already_processed.add('concentrationModel')
-            showIndent(outfile, level)
-            outfile.write('concentrationModel="%s",\n' % (self.concentrationModel,))
-        if self.id is not None and 'id' not in already_processed:
-            already_processed.add('id')
-            showIndent(outfile, level)
-            outfile.write('id="%s",\n' % (self.id,))
-        if self.initialConcentration is not None and 'initialConcentration' not in already_processed:
-            already_processed.add('initialConcentration')
-            showIndent(outfile, level)
-            outfile.write('initialConcentration="%s",\n' % (self.initialConcentration,))
-        super(Species, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(Species, self).exportLiteralChildren(outfile, level, name_)
+            outfile.write(' initialExtConcentration=%s' % (quote_attrib(self.initialExtConcentration), ))
+    def exportChildren(self, outfile, level, namespace_='', name_='Species', fromsubclass_=False, pretty_print=True):
+        super(Species, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
         pass
     def build(self, node):
         already_processed = set()
@@ -3218,32 +3019,33 @@ class Species(ValueAcrossSegOrSegGroup):
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('ion', node)
-        if value is not None and 'ion' not in already_processed:
-            already_processed.add('ion')
-            self.ion = value
-            self.validate_NmlId(self.ion)    # validate type NmlId
-        value = find_attr_value_('initialExtConcentration', node)
-        if value is not None and 'initialExtConcentration' not in already_processed:
-            already_processed.add('initialExtConcentration')
-            self.initialExtConcentration = value
-            self.validate_Nml2Quantity_concentration(self.initialExtConcentration)    # validate type Nml2Quantity_concentration
-        value = find_attr_value_('concentrationModel', node)
-        if value is not None and 'concentrationModel' not in already_processed:
-            already_processed.add('concentrationModel')
-            self.concentrationModel = value
-            self.validate_NmlId(self.concentrationModel)    # validate type NmlId
         value = find_attr_value_('id', node)
         if value is not None and 'id' not in already_processed:
             already_processed.add('id')
             self.id = value
             self.validate_NmlId(self.id)    # validate type NmlId
+        value = find_attr_value_('concentrationModel', node)
+        if value is not None and 'concentrationModel' not in already_processed:
+            already_processed.add('concentrationModel')
+            self.concentrationModel = value
+            self.validate_NmlId(self.concentrationModel)    # validate type NmlId
+        value = find_attr_value_('ion', node)
+        if value is not None and 'ion' not in already_processed:
+            already_processed.add('ion')
+            self.ion = value
+            self.validate_NmlId(self.ion)    # validate type NmlId
         value = find_attr_value_('initialConcentration', node)
         if value is not None and 'initialConcentration' not in already_processed:
             already_processed.add('initialConcentration')
             self.initialConcentration = value
             self.validate_Nml2Quantity_concentration(self.initialConcentration)    # validate type Nml2Quantity_concentration
+        value = find_attr_value_('initialExtConcentration', node)
+        if value is not None and 'initialExtConcentration' not in already_processed:
+            already_processed.add('initialExtConcentration')
+            self.initialExtConcentration = value
+            self.validate_Nml2Quantity_concentration(self.initialExtConcentration)    # validate type Nml2Quantity_concentration
         super(Species, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         super(Species, self).buildChildren(child_, node, nodeName_, True)
@@ -3255,6 +3057,7 @@ class IntracellularProperties(GeneratedsSuper):
     subclass = None
     superclass = None
     def __init__(self, species=None, resistivity=None):
+        self.original_tagname_ = None
         if species is None:
             self.species = []
         else:
@@ -3264,6 +3067,11 @@ class IntracellularProperties(GeneratedsSuper):
         else:
             self.resistivity = resistivity
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, IntracellularProperties)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if IntracellularProperties.subclass:
             return IntracellularProperties.subclass(*args_, **kwargs_)
         else:
@@ -3272,11 +3080,13 @@ class IntracellularProperties(GeneratedsSuper):
     def get_species(self): return self.species
     def set_species(self, species): self.species = species
     def add_species(self, value): self.species.append(value)
-    def insert_species(self, index, value): self.species[index] = value
+    def insert_species_at(self, index, value): self.species.insert(index, value)
+    def replace_species_at(self, index, value): self.species[index] = value
     def get_resistivity(self): return self.resistivity
     def set_resistivity(self, resistivity): self.resistivity = resistivity
     def add_resistivity(self, value): self.resistivity.append(value)
-    def insert_resistivity(self, index, value): self.resistivity[index] = value
+    def insert_resistivity_at(self, index, value): self.resistivity.insert(index, value)
+    def replace_resistivity_at(self, index, value): self.resistivity[index] = value
     def hasContent_(self):
         if (
             self.species or
@@ -3290,13 +3100,15 @@ class IntracellularProperties(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='IntracellularProperties')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='IntracellularProperties', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -3312,45 +3124,13 @@ class IntracellularProperties(GeneratedsSuper):
             species_.export(outfile, level, namespace_, name_='species', pretty_print=pretty_print)
         for resistivity_ in self.resistivity:
             resistivity_.export(outfile, level, namespace_, name_='resistivity', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='IntracellularProperties'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        pass
-    def exportLiteralChildren(self, outfile, level, name_):
-        showIndent(outfile, level)
-        outfile.write('species=[\n')
-        level += 1
-        for species_ in self.species:
-            showIndent(outfile, level)
-            outfile.write('model_.Species(\n')
-            species_.exportLiteral(outfile, level, name_='Species')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('resistivity=[\n')
-        level += 1
-        for resistivity_ in self.resistivity:
-            showIndent(outfile, level)
-            outfile.write('model_.ValueAcrossSegOrSegGroup(\n')
-            resistivity_.exportLiteral(outfile, level, name_='ValueAcrossSegOrSegGroup')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         pass
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
@@ -3358,11 +3138,13 @@ class IntracellularProperties(GeneratedsSuper):
             obj_ = Species.factory()
             obj_.build(child_)
             self.species.append(obj_)
+            obj_.original_tagname_ = 'species'
         elif nodeName_ == 'resistivity':
             class_obj_ = self.get_class_obj_(child_, ValueAcrossSegOrSegGroup)
             obj_ = class_obj_.factory()
             obj_.build(child_)
             self.resistivity.append(obj_)
+            obj_.original_tagname_ = 'resistivity'
 # end class IntracellularProperties
 
 
@@ -3370,12 +3152,18 @@ class ExtracellularPropertiesLocal(GeneratedsSuper):
     subclass = None
     superclass = None
     def __init__(self, temperature=None, species=None):
+        self.original_tagname_ = None
         self.temperature = _cast(None, temperature)
         if species is None:
             self.species = []
         else:
             self.species = species
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, ExtracellularPropertiesLocal)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if ExtracellularPropertiesLocal.subclass:
             return ExtracellularPropertiesLocal.subclass(*args_, **kwargs_)
         else:
@@ -3384,12 +3172,17 @@ class ExtracellularPropertiesLocal(GeneratedsSuper):
     def get_species(self): return self.species
     def set_species(self, species): self.species = species
     def add_species(self, value): self.species.append(value)
-    def insert_species(self, index, value): self.species[index] = value
+    def insert_species_at(self, index, value): self.species.insert(index, value)
+    def replace_species_at(self, index, value): self.species[index] = value
     def get_temperature(self): return self.temperature
     def set_temperature(self, temperature): self.temperature = temperature
     def validate_Nml2Quantity_temperature(self, value):
         # Validate type Nml2Quantity_temperature, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_temperature_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_temperature_patterns_, ))
+    validate_Nml2Quantity_temperature_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(degC)$']]
     def hasContent_(self):
         if (
             self.species
@@ -3402,13 +3195,15 @@ class ExtracellularPropertiesLocal(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='ExtracellularPropertiesLocal')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='ExtracellularPropertiesLocal', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -3424,36 +3219,13 @@ class ExtracellularPropertiesLocal(GeneratedsSuper):
             eol_ = ''
         for species_ in self.species:
             species_.export(outfile, level, namespace_, name_='species', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='ExtracellularPropertiesLocal'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.temperature is not None and 'temperature' not in already_processed:
-            already_processed.add('temperature')
-            showIndent(outfile, level)
-            outfile.write('temperature="%s",\n' % (self.temperature,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        showIndent(outfile, level)
-        outfile.write('species=[\n')
-        level += 1
-        for species_ in self.species:
-            showIndent(outfile, level)
-            outfile.write('model_.Species(\n')
-            species_.exportLiteral(outfile, level, name_='Species')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('temperature', node)
         if value is not None and 'temperature' not in already_processed:
@@ -3465,38 +3237,44 @@ class ExtracellularPropertiesLocal(GeneratedsSuper):
             obj_ = Species.factory()
             obj_.build(child_)
             self.species.append(obj_)
+            obj_.original_tagname_ = 'species'
 # end class ExtracellularPropertiesLocal
 
 
 class SpaceStructure(GeneratedsSuper):
     subclass = None
     superclass = None
-    def __init__(self, ySpacing=None, zStart=0, yStart=0, zSpacing=None, xStart=0, xSpacing=None):
+    def __init__(self, xSpacing=None, ySpacing=None, zSpacing=None, xStart=0, yStart=0, zStart=0):
+        self.original_tagname_ = None
+        self.xSpacing = _cast(float, xSpacing)
         self.ySpacing = _cast(float, ySpacing)
-        self.zStart = _cast(float, zStart)
-        self.yStart = _cast(float, yStart)
         self.zSpacing = _cast(float, zSpacing)
         self.xStart = _cast(float, xStart)
-        self.xSpacing = _cast(float, xSpacing)
-        pass
+        self.yStart = _cast(float, yStart)
+        self.zStart = _cast(float, zStart)
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, SpaceStructure)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if SpaceStructure.subclass:
             return SpaceStructure.subclass(*args_, **kwargs_)
         else:
             return SpaceStructure(*args_, **kwargs_)
     factory = staticmethod(factory)
+    def get_xSpacing(self): return self.xSpacing
+    def set_xSpacing(self, xSpacing): self.xSpacing = xSpacing
     def get_ySpacing(self): return self.ySpacing
     def set_ySpacing(self, ySpacing): self.ySpacing = ySpacing
-    def get_zStart(self): return self.zStart
-    def set_zStart(self, zStart): self.zStart = zStart
-    def get_yStart(self): return self.yStart
-    def set_yStart(self, yStart): self.yStart = yStart
     def get_zSpacing(self): return self.zSpacing
     def set_zSpacing(self, zSpacing): self.zSpacing = zSpacing
     def get_xStart(self): return self.xStart
     def set_xStart(self, xStart): self.xStart = xStart
-    def get_xSpacing(self): return self.xSpacing
-    def set_xSpacing(self, xSpacing): self.xSpacing = xSpacing
+    def get_yStart(self): return self.yStart
+    def set_yStart(self, yStart): self.yStart = yStart
+    def get_zStart(self): return self.zStart
+    def set_zStart(self, zStart): self.zStart = zStart
     def hasContent_(self):
         if (
 
@@ -3509,69 +3287,38 @@ class SpaceStructure(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='SpaceStructure')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='SpaceStructure', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='SpaceStructure'):
+        if self.xSpacing is not None and 'xSpacing' not in already_processed:
+            already_processed.add('xSpacing')
+            outfile.write(' xSpacing="%s"' % self.gds_format_float(self.xSpacing, input_name='xSpacing'))
         if self.ySpacing is not None and 'ySpacing' not in already_processed:
             already_processed.add('ySpacing')
             outfile.write(' ySpacing="%s"' % self.gds_format_float(self.ySpacing, input_name='ySpacing'))
-        if self.zStart is not None and 'zStart' not in already_processed:
-            already_processed.add('zStart')
-            outfile.write(' zStart="%s"' % self.gds_format_float(self.zStart, input_name='zStart'))
-        if self.yStart is not None and 'yStart' not in already_processed:
-            already_processed.add('yStart')
-            outfile.write(' yStart="%s"' % self.gds_format_float(self.yStart, input_name='yStart'))
         if self.zSpacing is not None and 'zSpacing' not in already_processed:
             already_processed.add('zSpacing')
             outfile.write(' zSpacing="%s"' % self.gds_format_float(self.zSpacing, input_name='zSpacing'))
-        if self.xStart is not None and 'xStart' not in already_processed:
+        if self.xStart != 0 and 'xStart' not in already_processed:
             already_processed.add('xStart')
             outfile.write(' xStart="%s"' % self.gds_format_float(self.xStart, input_name='xStart'))
-        if self.xSpacing is not None and 'xSpacing' not in already_processed:
-            already_processed.add('xSpacing')
-            outfile.write(' xSpacing="%s"' % self.gds_format_float(self.xSpacing, input_name='xSpacing'))
-    def exportChildren(self, outfile, level, namespace_='', name_='SpaceStructure', fromsubclass_=False, pretty_print=True):
-        pass
-    def exportLiteral(self, outfile, level, name_='SpaceStructure'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.ySpacing is not None and 'ySpacing' not in already_processed:
-            already_processed.add('ySpacing')
-            showIndent(outfile, level)
-            outfile.write('ySpacing=%f,\n' % (self.ySpacing,))
-        if self.zStart is not None and 'zStart' not in already_processed:
-            already_processed.add('zStart')
-            showIndent(outfile, level)
-            outfile.write('zStart=%f,\n' % (self.zStart,))
-        if self.yStart is not None and 'yStart' not in already_processed:
+        if self.yStart is not None and self.yStart != 0 and 'yStart' not in already_processed:
             already_processed.add('yStart')
-            showIndent(outfile, level)
-            outfile.write('yStart=%f,\n' % (self.yStart,))
-        if self.zSpacing is not None and 'zSpacing' not in already_processed:
-            already_processed.add('zSpacing')
-            showIndent(outfile, level)
-            outfile.write('zSpacing=%f,\n' % (self.zSpacing,))
-        if self.xStart is not None and 'xStart' not in already_processed:
-            already_processed.add('xStart')
-            showIndent(outfile, level)
-            outfile.write('xStart=%f,\n' % (self.xStart,))
-        if self.xSpacing is not None and 'xSpacing' not in already_processed:
-            already_processed.add('xSpacing')
-            showIndent(outfile, level)
-            outfile.write('xSpacing=%f,\n' % (self.xSpacing,))
-    def exportLiteralChildren(self, outfile, level, name_):
+            outfile.write(' yStart="%s"' % self.gds_format_float(self.yStart, input_name='yStart'))
+        if self.zStart is not None and self.zStart != 0 and 'zStart' not in already_processed:
+            already_processed.add('zStart')
+            outfile.write(' zStart="%s"' % self.gds_format_float(self.zStart, input_name='zStart'))
+    def exportChildren(self, outfile, level, namespace_='', name_='SpaceStructure', fromsubclass_=False, pretty_print=True):
         pass
     def build(self, node):
         already_processed = set()
@@ -3579,7 +3326,15 @@ class SpaceStructure(GeneratedsSuper):
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
+        value = find_attr_value_('xSpacing', node)
+        if value is not None and 'xSpacing' not in already_processed:
+            already_processed.add('xSpacing')
+            try:
+                self.xSpacing = float(value)
+            except ValueError as exp:
+                raise ValueError('Bad float/double attribute (xSpacing): %s' % exp)
         value = find_attr_value_('ySpacing', node)
         if value is not None and 'ySpacing' not in already_processed:
             already_processed.add('ySpacing')
@@ -3587,20 +3342,6 @@ class SpaceStructure(GeneratedsSuper):
                 self.ySpacing = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (ySpacing): %s' % exp)
-        value = find_attr_value_('zStart', node)
-        if value is not None and 'zStart' not in already_processed:
-            already_processed.add('zStart')
-            try:
-                self.zStart = float(value)
-            except ValueError as exp:
-                raise ValueError('Bad float/double attribute (zStart): %s' % exp)
-        value = find_attr_value_('yStart', node)
-        if value is not None and 'yStart' not in already_processed:
-            already_processed.add('yStart')
-            try:
-                self.yStart = float(value)
-            except ValueError as exp:
-                raise ValueError('Bad float/double attribute (yStart): %s' % exp)
         value = find_attr_value_('zSpacing', node)
         if value is not None and 'zSpacing' not in already_processed:
             already_processed.add('zSpacing')
@@ -3615,13 +3356,20 @@ class SpaceStructure(GeneratedsSuper):
                 self.xStart = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (xStart): %s' % exp)
-        value = find_attr_value_('xSpacing', node)
-        if value is not None and 'xSpacing' not in already_processed:
-            already_processed.add('xSpacing')
+        value = find_attr_value_('yStart', node)
+        if value is not None and 'yStart' not in already_processed:
+            already_processed.add('yStart')
             try:
-                self.xSpacing = float(value)
+                self.yStart = float(value)
             except ValueError as exp:
-                raise ValueError('Bad float/double attribute (xSpacing): %s' % exp)
+                raise ValueError('Bad float/double attribute (yStart): %s' % exp)
+        value = find_attr_value_('zStart', node)
+        if value is not None and 'zStart' not in already_processed:
+            already_processed.add('zStart')
+            try:
+                self.zStart = float(value)
+            except ValueError as exp:
+                raise ValueError('Bad float/double attribute (zStart): %s' % exp)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         pass
 # end class SpaceStructure
@@ -3631,11 +3379,17 @@ class Layout(GeneratedsSuper):
     subclass = None
     superclass = None
     def __init__(self, space=None, random=None, grid=None, unstructured=None):
+        self.original_tagname_ = None
         self.space = _cast(None, space)
         self.random = random
         self.grid = grid
         self.unstructured = unstructured
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Layout)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Layout.subclass:
             return Layout.subclass(*args_, **kwargs_)
         else:
@@ -3651,7 +3405,11 @@ class Layout(GeneratedsSuper):
     def set_space(self, space): self.space = space
     def validate_NmlId(self, value):
         # Validate type NmlId, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
     def hasContent_(self):
         if (
             self.random is not None or
@@ -3666,13 +3424,15 @@ class Layout(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Layout')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Layout', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -3692,42 +3452,13 @@ class Layout(GeneratedsSuper):
             self.grid.export(outfile, level, namespace_, name_='grid', pretty_print=pretty_print)
         if self.unstructured is not None:
             self.unstructured.export(outfile, level, namespace_, name_='unstructured', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='Layout'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.space is not None and 'space' not in already_processed:
-            already_processed.add('space')
-            showIndent(outfile, level)
-            outfile.write('space="%s",\n' % (self.space,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        if self.random is not None:
-            showIndent(outfile, level)
-            outfile.write('random=model_.RandomLayout(\n')
-            self.random.exportLiteral(outfile, level, name_='random')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.grid is not None:
-            showIndent(outfile, level)
-            outfile.write('grid=model_.GridLayout(\n')
-            self.grid.exportLiteral(outfile, level, name_='grid')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.unstructured is not None:
-            showIndent(outfile, level)
-            outfile.write('unstructured=model_.UnstructuredLayout(\n')
-            self.unstructured.exportLiteral(outfile, level, name_='unstructured')
-            showIndent(outfile, level)
-            outfile.write('),\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('space', node)
         if value is not None and 'space' not in already_processed:
@@ -3738,15 +3469,18 @@ class Layout(GeneratedsSuper):
         if nodeName_ == 'random':
             obj_ = RandomLayout.factory()
             obj_.build(child_)
-            self.set_random(obj_)
+            self.random = obj_
+            obj_.original_tagname_ = 'random'
         elif nodeName_ == 'grid':
             obj_ = GridLayout.factory()
             obj_.build(child_)
-            self.set_grid(obj_)
+            self.grid = obj_
+            obj_.original_tagname_ = 'grid'
         elif nodeName_ == 'unstructured':
             obj_ = UnstructuredLayout.factory()
             obj_.build(child_)
-            self.set_unstructured(obj_)
+            self.unstructured = obj_
+            obj_.original_tagname_ = 'unstructured'
 # end class Layout
 
 
@@ -3754,9 +3488,14 @@ class UnstructuredLayout(GeneratedsSuper):
     subclass = None
     superclass = None
     def __init__(self, number=None):
+        self.original_tagname_ = None
         self.number = _cast(int, number)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, UnstructuredLayout)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if UnstructuredLayout.subclass:
             return UnstructuredLayout.subclass(*args_, **kwargs_)
         else:
@@ -3776,13 +3515,15 @@ class UnstructuredLayout(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='UnstructuredLayout')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='UnstructuredLayout', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
@@ -3792,25 +3533,13 @@ class UnstructuredLayout(GeneratedsSuper):
             outfile.write(' number="%s"' % self.gds_format_integer(self.number, input_name='number'))
     def exportChildren(self, outfile, level, namespace_='', name_='UnstructuredLayout', fromsubclass_=False, pretty_print=True):
         pass
-    def exportLiteral(self, outfile, level, name_='UnstructuredLayout'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.number is not None and 'number' not in already_processed:
-            already_processed.add('number')
-            showIndent(outfile, level)
-            outfile.write('number=%d,\n' % (self.number,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        pass
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('number', node)
         if value is not None and 'number' not in already_processed:
@@ -3829,23 +3558,32 @@ class UnstructuredLayout(GeneratedsSuper):
 class RandomLayout(GeneratedsSuper):
     subclass = None
     superclass = None
-    def __init__(self, region=None, number=None):
-        self.region = _cast(None, region)
+    def __init__(self, number=None, region=None):
+        self.original_tagname_ = None
         self.number = _cast(int, number)
-        pass
+        self.region = _cast(None, region)
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, RandomLayout)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if RandomLayout.subclass:
             return RandomLayout.subclass(*args_, **kwargs_)
         else:
             return RandomLayout(*args_, **kwargs_)
     factory = staticmethod(factory)
+    def get_number(self): return self.number
+    def set_number(self, number): self.number = number
     def get_region(self): return self.region
     def set_region(self, region): self.region = region
     def validate_NmlId(self, value):
         # Validate type NmlId, a restriction on xs:string.
-        pass
-    def get_number(self): return self.number
-    def set_number(self, number): self.number = number
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
     def hasContent_(self):
         if (
 
@@ -3858,41 +3596,26 @@ class RandomLayout(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='RandomLayout')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='RandomLayout', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='RandomLayout'):
-        if self.region is not None and 'region' not in already_processed:
-            already_processed.add('region')
-            outfile.write(' region=%s' % (quote_attrib(self.region), ))
         if self.number is not None and 'number' not in already_processed:
             already_processed.add('number')
             outfile.write(' number="%s"' % self.gds_format_integer(self.number, input_name='number'))
-    def exportChildren(self, outfile, level, namespace_='', name_='RandomLayout', fromsubclass_=False, pretty_print=True):
-        pass
-    def exportLiteral(self, outfile, level, name_='RandomLayout'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
         if self.region is not None and 'region' not in already_processed:
             already_processed.add('region')
-            showIndent(outfile, level)
-            outfile.write('region="%s",\n' % (self.region,))
-        if self.number is not None and 'number' not in already_processed:
-            already_processed.add('number')
-            showIndent(outfile, level)
-            outfile.write('number=%d,\n' % (self.number,))
-    def exportLiteralChildren(self, outfile, level, name_):
+            outfile.write(' region=%s' % (quote_attrib(self.region), ))
+    def exportChildren(self, outfile, level, namespace_='', name_='RandomLayout', fromsubclass_=False, pretty_print=True):
         pass
     def build(self, node):
         already_processed = set()
@@ -3900,12 +3623,8 @@ class RandomLayout(GeneratedsSuper):
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('region', node)
-        if value is not None and 'region' not in already_processed:
-            already_processed.add('region')
-            self.region = value
-            self.validate_NmlId(self.region)    # validate type NmlId
         value = find_attr_value_('number', node)
         if value is not None and 'number' not in already_processed:
             already_processed.add('number')
@@ -3915,6 +3634,11 @@ class RandomLayout(GeneratedsSuper):
                 raise_parse_error(node, 'Bad integer attribute: %s' % exp)
             if self.number < 0:
                 raise_parse_error(node, 'Invalid NonNegativeInteger')
+        value = find_attr_value_('region', node)
+        if value is not None and 'region' not in already_processed:
+            already_processed.add('region')
+            self.region = value
+            self.validate_NmlId(self.region)    # validate type NmlId
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         pass
 # end class RandomLayout
@@ -3923,25 +3647,30 @@ class RandomLayout(GeneratedsSuper):
 class GridLayout(GeneratedsSuper):
     subclass = None
     superclass = None
-    def __init__(self, zSize=None, ySize=None, xSize=None):
-        self.zSize = _cast(int, zSize)
-        self.ySize = _cast(int, ySize)
+    def __init__(self, xSize=None, ySize=None, zSize=None):
+        self.original_tagname_ = None
         self.xSize = _cast(int, xSize)
-        pass
+        self.ySize = _cast(int, ySize)
+        self.zSize = _cast(int, zSize)
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, GridLayout)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if GridLayout.subclass:
             return GridLayout.subclass(*args_, **kwargs_)
         else:
             return GridLayout(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_zSize(self): return self.zSize
-    def set_zSize(self, zSize): self.zSize = zSize
-    def get_ySize(self): return self.ySize
-    def set_ySize(self, ySize): self.ySize = ySize
     def get_xSize(self): return self.xSize
     def set_xSize(self, xSize): self.xSize = xSize
-    def hasContent_(self):
-        if (
+    def get_ySize(self): return self.ySize
+    def set_ySize(self, ySize): self.ySize = ySize
+    def get_zSize(self): return self.zSize
+    def set_zSize(self, zSize): self.zSize = zSize
+    def hasContent_(self):
+        if (
 
         ):
             return True
@@ -3952,48 +3681,29 @@ class GridLayout(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='GridLayout')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='GridLayout', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='GridLayout'):
-        if self.zSize is not None and 'zSize' not in already_processed:
-            already_processed.add('zSize')
-            outfile.write(' zSize="%s"' % self.gds_format_integer(self.zSize, input_name='zSize'))
-        if self.ySize is not None and 'ySize' not in already_processed:
-            already_processed.add('ySize')
-            outfile.write(' ySize="%s"' % self.gds_format_integer(self.ySize, input_name='ySize'))
         if self.xSize is not None and 'xSize' not in already_processed:
             already_processed.add('xSize')
             outfile.write(' xSize="%s"' % self.gds_format_integer(self.xSize, input_name='xSize'))
-    def exportChildren(self, outfile, level, namespace_='', name_='GridLayout', fromsubclass_=False, pretty_print=True):
-        pass
-    def exportLiteral(self, outfile, level, name_='GridLayout'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.zSize is not None and 'zSize' not in already_processed:
-            already_processed.add('zSize')
-            showIndent(outfile, level)
-            outfile.write('zSize=%d,\n' % (self.zSize,))
         if self.ySize is not None and 'ySize' not in already_processed:
             already_processed.add('ySize')
-            showIndent(outfile, level)
-            outfile.write('ySize=%d,\n' % (self.ySize,))
-        if self.xSize is not None and 'xSize' not in already_processed:
-            already_processed.add('xSize')
-            showIndent(outfile, level)
-            outfile.write('xSize=%d,\n' % (self.xSize,))
-    def exportLiteralChildren(self, outfile, level, name_):
+            outfile.write(' ySize="%s"' % self.gds_format_integer(self.ySize, input_name='ySize'))
+        if self.zSize is not None and 'zSize' not in already_processed:
+            already_processed.add('zSize')
+            outfile.write(' zSize="%s"' % self.gds_format_integer(self.zSize, input_name='zSize'))
+    def exportChildren(self, outfile, level, namespace_='', name_='GridLayout', fromsubclass_=False, pretty_print=True):
         pass
     def build(self, node):
         already_processed = set()
@@ -4001,15 +3711,16 @@ class GridLayout(GeneratedsSuper):
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('zSize', node)
-        if value is not None and 'zSize' not in already_processed:
-            already_processed.add('zSize')
+        value = find_attr_value_('xSize', node)
+        if value is not None and 'xSize' not in already_processed:
+            already_processed.add('xSize')
             try:
-                self.zSize = int(value)
+                self.xSize = int(value)
             except ValueError as exp:
                 raise_parse_error(node, 'Bad integer attribute: %s' % exp)
-            if self.zSize < 0:
+            if self.xSize < 0:
                 raise_parse_error(node, 'Invalid NonNegativeInteger')
         value = find_attr_value_('ySize', node)
         if value is not None and 'ySize' not in already_processed:
@@ -4020,14 +3731,14 @@ class GridLayout(GeneratedsSuper):
                 raise_parse_error(node, 'Bad integer attribute: %s' % exp)
             if self.ySize < 0:
                 raise_parse_error(node, 'Invalid NonNegativeInteger')
-        value = find_attr_value_('xSize', node)
-        if value is not None and 'xSize' not in already_processed:
-            already_processed.add('xSize')
+        value = find_attr_value_('zSize', node)
+        if value is not None and 'zSize' not in already_processed:
+            already_processed.add('zSize')
             try:
-                self.xSize = int(value)
+                self.zSize = int(value)
             except ValueError as exp:
                 raise_parse_error(node, 'Bad integer attribute: %s' % exp)
-            if self.xSize < 0:
+            if self.zSize < 0:
                 raise_parse_error(node, 'Invalid NonNegativeInteger')
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         pass
@@ -4037,13 +3748,19 @@ class GridLayout(GeneratedsSuper):
 class Instance(GeneratedsSuper):
     subclass = None
     superclass = None
-    def __init__(self, i=None, k=None, j=None, id=None, location=None):
+    def __init__(self, id=None, i=None, j=None, k=None, location=None):
+        self.original_tagname_ = None
+        self.id = _cast(int, id)
         self.i = _cast(int, i)
-        self.k = _cast(int, k)
         self.j = _cast(int, j)
-        self.id = _cast(int, id)
+        self.k = _cast(int, k)
         self.location = location
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Instance)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Instance.subclass:
             return Instance.subclass(*args_, **kwargs_)
         else:
@@ -4051,14 +3768,14 @@ class Instance(GeneratedsSuper):
     factory = staticmethod(factory)
     def get_location(self): return self.location
     def set_location(self, location): self.location = location
+    def get_id(self): return self.id
+    def set_id(self, id): self.id = id
     def get_i(self): return self.i
     def set_i(self, i): self.i = i
-    def get_k(self): return self.k
-    def set_k(self, k): self.k = k
     def get_j(self): return self.j
     def set_j(self, j): self.j = j
-    def get_id(self): return self.id
-    def set_id(self, id): self.id = id
+    def get_k(self): return self.k
+    def set_k(self, k): self.k = k
     def hasContent_(self):
         if (
             self.location is not None
@@ -4071,30 +3788,32 @@ class Instance(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Instance')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Instance', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='Instance'):
+        if self.id is not None and 'id' not in already_processed:
+            already_processed.add('id')
+            outfile.write(' id="%s"' % self.gds_format_integer(self.id, input_name='id'))
         if self.i is not None and 'i' not in already_processed:
             already_processed.add('i')
             outfile.write(' i="%s"' % self.gds_format_integer(self.i, input_name='i'))
-        if self.k is not None and 'k' not in already_processed:
-            already_processed.add('k')
-            outfile.write(' k="%s"' % self.gds_format_integer(self.k, input_name='k'))
         if self.j is not None and 'j' not in already_processed:
             already_processed.add('j')
             outfile.write(' j="%s"' % self.gds_format_integer(self.j, input_name='j'))
-        if self.id is not None and 'id' not in already_processed:
-            already_processed.add('id')
-            outfile.write(' id="%s"' % self.gds_format_integer(self.id, input_name='id'))
+        if self.k is not None and 'k' not in already_processed:
+            already_processed.add('k')
+            outfile.write(' k="%s"' % self.gds_format_integer(self.k, input_name='k'))
     def exportChildren(self, outfile, level, namespace_='', name_='Instance', fromsubclass_=False, pretty_print=True):
         if pretty_print:
             eol_ = '\n'
@@ -4102,43 +3821,23 @@ class Instance(GeneratedsSuper):
             eol_ = ''
         if self.location is not None:
             self.location.export(outfile, level, namespace_, name_='location', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='Instance'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.i is not None and 'i' not in already_processed:
-            already_processed.add('i')
-            showIndent(outfile, level)
-            outfile.write('i=%d,\n' % (self.i,))
-        if self.k is not None and 'k' not in already_processed:
-            already_processed.add('k')
-            showIndent(outfile, level)
-            outfile.write('k=%d,\n' % (self.k,))
-        if self.j is not None and 'j' not in already_processed:
-            already_processed.add('j')
-            showIndent(outfile, level)
-            outfile.write('j=%d,\n' % (self.j,))
-        if self.id is not None and 'id' not in already_processed:
-            already_processed.add('id')
-            showIndent(outfile, level)
-            outfile.write('id=%d,\n' % (self.id,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        if self.location is not None:
-            showIndent(outfile, level)
-            outfile.write('location=model_.Location(\n')
-            self.location.exportLiteral(outfile, level, name_='location')
-            showIndent(outfile, level)
-            outfile.write('),\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
+        value = find_attr_value_('id', node)
+        if value is not None and 'id' not in already_processed:
+            already_processed.add('id')
+            try:
+                self.id = int(value)
+            except ValueError as exp:
+                raise_parse_error(node, 'Bad integer attribute: %s' % exp)
+            if self.id < 0:
+                raise_parse_error(node, 'Invalid NonNegativeInteger')
         value = find_attr_value_('i', node)
         if value is not None and 'i' not in already_processed:
             already_processed.add('i')
@@ -4148,15 +3847,6 @@ class Instance(GeneratedsSuper):
                 raise_parse_error(node, 'Bad integer attribute: %s' % exp)
             if self.i < 0:
                 raise_parse_error(node, 'Invalid NonNegativeInteger')
-        value = find_attr_value_('k', node)
-        if value is not None and 'k' not in already_processed:
-            already_processed.add('k')
-            try:
-                self.k = int(value)
-            except ValueError as exp:
-                raise_parse_error(node, 'Bad integer attribute: %s' % exp)
-            if self.k < 0:
-                raise_parse_error(node, 'Invalid NonNegativeInteger')
         value = find_attr_value_('j', node)
         if value is not None and 'j' not in already_processed:
             already_processed.add('j')
@@ -4166,41 +3856,47 @@ class Instance(GeneratedsSuper):
                 raise_parse_error(node, 'Bad integer attribute: %s' % exp)
             if self.j < 0:
                 raise_parse_error(node, 'Invalid NonNegativeInteger')
-        value = find_attr_value_('id', node)
-        if value is not None and 'id' not in already_processed:
-            already_processed.add('id')
+        value = find_attr_value_('k', node)
+        if value is not None and 'k' not in already_processed:
+            already_processed.add('k')
             try:
-                self.id = int(value)
+                self.k = int(value)
             except ValueError as exp:
                 raise_parse_error(node, 'Bad integer attribute: %s' % exp)
-            if self.id < 0:
+            if self.k < 0:
                 raise_parse_error(node, 'Invalid NonNegativeInteger')
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         if nodeName_ == 'location':
             obj_ = Location.factory()
             obj_.build(child_)
-            self.set_location(obj_)
+            self.location = obj_
+            obj_.original_tagname_ = 'location'
 # end class Instance
 
 
 class Location(GeneratedsSuper):
     subclass = None
     superclass = None
-    def __init__(self, y=None, x=None, z=None):
-        self.y = _cast(float, y)
+    def __init__(self, x=None, y=None, z=None):
+        self.original_tagname_ = None
         self.x = _cast(float, x)
+        self.y = _cast(float, y)
         self.z = _cast(float, z)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Location)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Location.subclass:
             return Location.subclass(*args_, **kwargs_)
         else:
             return Location(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_y(self): return self.y
-    def set_y(self, y): self.y = y
     def get_x(self): return self.x
     def set_x(self, x): self.x = x
+    def get_y(self): return self.y
+    def set_y(self, y): self.y = y
     def get_z(self): return self.z
     def set_z(self, z): self.z = z
     def hasContent_(self):
@@ -4215,48 +3911,29 @@ class Location(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Location')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Location', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='Location'):
-        if self.y is not None and 'y' not in already_processed:
-            already_processed.add('y')
-            outfile.write(' y="%s"' % self.gds_format_float(self.y, input_name='y'))
         if self.x is not None and 'x' not in already_processed:
             already_processed.add('x')
             outfile.write(' x="%s"' % self.gds_format_float(self.x, input_name='x'))
-        if self.z is not None and 'z' not in already_processed:
-            already_processed.add('z')
-            outfile.write(' z="%s"' % self.gds_format_float(self.z, input_name='z'))
-    def exportChildren(self, outfile, level, namespace_='', name_='Location', fromsubclass_=False, pretty_print=True):
-        pass
-    def exportLiteral(self, outfile, level, name_='Location'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
         if self.y is not None and 'y' not in already_processed:
             already_processed.add('y')
-            showIndent(outfile, level)
-            outfile.write('y=%f,\n' % (self.y,))
-        if self.x is not None and 'x' not in already_processed:
-            already_processed.add('x')
-            showIndent(outfile, level)
-            outfile.write('x=%f,\n' % (self.x,))
+            outfile.write(' y="%s"' % self.gds_format_float(self.y, input_name='y'))
         if self.z is not None and 'z' not in already_processed:
             already_processed.add('z')
-            showIndent(outfile, level)
-            outfile.write('z=%f,\n' % (self.z,))
-    def exportLiteralChildren(self, outfile, level, name_):
+            outfile.write(' z="%s"' % self.gds_format_float(self.z, input_name='z'))
+    def exportChildren(self, outfile, level, namespace_='', name_='Location', fromsubclass_=False, pretty_print=True):
         pass
     def build(self, node):
         already_processed = set()
@@ -4264,14 +3941,8 @@ class Location(GeneratedsSuper):
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('y', node)
-        if value is not None and 'y' not in already_processed:
-            already_processed.add('y')
-            try:
-                self.y = float(value)
-            except ValueError as exp:
-                raise ValueError('Bad float/double attribute (y): %s' % exp)
         value = find_attr_value_('x', node)
         if value is not None and 'x' not in already_processed:
             already_processed.add('x')
@@ -4279,6 +3950,13 @@ class Location(GeneratedsSuper):
                 self.x = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (x): %s' % exp)
+        value = find_attr_value_('y', node)
+        if value is not None and 'y' not in already_processed:
+            already_processed.add('y')
+            try:
+                self.y = float(value)
+            except ValueError as exp:
+                raise ValueError('Bad float/double attribute (y): %s' % exp)
         value = find_attr_value_('z', node)
         if value is not None and 'z' not in already_processed:
             already_processed.add('z')
@@ -4297,23 +3975,28 @@ class SynapticConnection(GeneratedsSuper):
     projection element"""
     subclass = None
     superclass = None
-    def __init__(self, to=None, synapse=None, fromxx=None):
+    def __init__(self, from_=None, to=None, synapse=None):
+        self.original_tagname_ = None
+        self.from_ = _cast(None, from_)
         self.to = _cast(None, to)
         self.synapse = _cast(None, synapse)
-        self.fromxx = _cast(None, fromxx)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, SynapticConnection)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if SynapticConnection.subclass:
             return SynapticConnection.subclass(*args_, **kwargs_)
         else:
             return SynapticConnection(*args_, **kwargs_)
     factory = staticmethod(factory)
+    def get_from(self): return self.from_
+    def set_from(self, from_): self.from_ = from_
     def get_to(self): return self.to
     def set_to(self, to): self.to = to
     def get_synapse(self): return self.synapse
     def set_synapse(self, synapse): self.synapse = synapse
-    def get_from(self): return self.fromxx
-    def set_from(self, fromxx): self.fromxx = fromxx
     def hasContent_(self):
         if (
 
@@ -4326,56 +4009,42 @@ class SynapticConnection(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='SynapticConnection')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='SynapticConnection', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='SynapticConnection'):
+        if self.from_ is not None and 'from_' not in already_processed:
+            already_processed.add('from_')
+            outfile.write(' from=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.from_), input_name='from')), ))
         if self.to is not None and 'to' not in already_processed:
             already_processed.add('to')
-            outfile.write(' to=%s' % (self.gds_format_string(quote_attrib(self.to).encode(ExternalEncoding), input_name='to'), ))
+            outfile.write(' to=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.to), input_name='to')), ))
         if self.synapse is not None and 'synapse' not in already_processed:
             already_processed.add('synapse')
-            outfile.write(' synapse=%s' % (self.gds_format_string(quote_attrib(self.synapse).encode(ExternalEncoding), input_name='synapse'), ))
-        if self.fromxx is not None and 'fromxx' not in already_processed:
-            already_processed.add('fromxx')
-            outfile.write(' from=%s' % (self.gds_format_string(quote_attrib(self.fromxx).encode(ExternalEncoding), input_name='from'), ))
+            outfile.write(' synapse=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.synapse), input_name='synapse')), ))
     def exportChildren(self, outfile, level, namespace_='', name_='SynapticConnection', fromsubclass_=False, pretty_print=True):
         pass
-    def exportLiteral(self, outfile, level, name_='SynapticConnection'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.to is not None and 'to' not in already_processed:
-            already_processed.add('to')
-            showIndent(outfile, level)
-            outfile.write('to="%s",\n' % (self.to,))
-        if self.synapse is not None and 'synapse' not in already_processed:
-            already_processed.add('synapse')
-            showIndent(outfile, level)
-            outfile.write('synapse="%s",\n' % (self.synapse,))
-        if self.fromxx is not None and 'fromxx' not in already_processed:
-            already_processed.add('fromxx')
-            showIndent(outfile, level)
-            outfile.write('fromxx="%s",\n' % (self.fromxx,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        pass
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
+        value = find_attr_value_('from', node)
+        if value is not None and 'from' not in already_processed:
+            already_processed.add('from')
+            self.from_ = value
         value = find_attr_value_('to', node)
         if value is not None and 'to' not in already_processed:
             already_processed.add('to')
@@ -4384,10 +4053,6 @@ class SynapticConnection(GeneratedsSuper):
         if value is not None and 'synapse' not in already_processed:
             already_processed.add('synapse')
             self.synapse = value
-        value = find_attr_value_('from', node)
-        if value is not None and 'from' not in already_processed:
-            already_processed.add('from')
-            self.fromxx = value
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         pass
 # end class SynapticConnection
@@ -4397,23 +4062,28 @@ class Connection(GeneratedsSuper):
     """Subject to change as it gets tested with LEMS"""
     subclass = None
     superclass = None
-    def __init__(self, postCellId=None, id=None, preCellId=None):
-        self.postCellId = _cast(None, postCellId)
+    def __init__(self, id=None, preCellId=None, postCellId=None):
+        self.original_tagname_ = None
         self.id = _cast(int, id)
         self.preCellId = _cast(None, preCellId)
-        pass
+        self.postCellId = _cast(None, postCellId)
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Connection)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Connection.subclass:
             return Connection.subclass(*args_, **kwargs_)
         else:
             return Connection(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_postCellId(self): return self.postCellId
-    def set_postCellId(self, postCellId): self.postCellId = postCellId
     def get_id(self): return self.id
     def set_id(self, id): self.id = id
     def get_preCellId(self): return self.preCellId
     def set_preCellId(self, preCellId): self.preCellId = preCellId
+    def get_postCellId(self): return self.postCellId
+    def set_postCellId(self, postCellId): self.postCellId = postCellId
     def hasContent_(self):
         if (
 
@@ -4426,48 +4096,29 @@ class Connection(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Connection')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Connection', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='Connection'):
-        if self.postCellId is not None and 'postCellId' not in already_processed:
-            already_processed.add('postCellId')
-            outfile.write(' postCellId=%s' % (self.gds_format_string(quote_attrib(self.postCellId).encode(ExternalEncoding), input_name='postCellId'), ))
         if self.id is not None and 'id' not in already_processed:
             already_processed.add('id')
             outfile.write(' id="%s"' % self.gds_format_integer(self.id, input_name='id'))
         if self.preCellId is not None and 'preCellId' not in already_processed:
             already_processed.add('preCellId')
-            outfile.write(' preCellId=%s' % (self.gds_format_string(quote_attrib(self.preCellId).encode(ExternalEncoding), input_name='preCellId'), ))
-    def exportChildren(self, outfile, level, namespace_='', name_='Connection', fromsubclass_=False, pretty_print=True):
-        pass
-    def exportLiteral(self, outfile, level, name_='Connection'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
+            outfile.write(' preCellId=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.preCellId), input_name='preCellId')), ))
         if self.postCellId is not None and 'postCellId' not in already_processed:
             already_processed.add('postCellId')
-            showIndent(outfile, level)
-            outfile.write('postCellId="%s",\n' % (self.postCellId,))
-        if self.id is not None and 'id' not in already_processed:
-            already_processed.add('id')
-            showIndent(outfile, level)
-            outfile.write('id=%d,\n' % (self.id,))
-        if self.preCellId is not None and 'preCellId' not in already_processed:
-            already_processed.add('preCellId')
-            showIndent(outfile, level)
-            outfile.write('preCellId="%s",\n' % (self.preCellId,))
-    def exportLiteralChildren(self, outfile, level, name_):
+            outfile.write(' postCellId=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.postCellId), input_name='postCellId')), ))
+    def exportChildren(self, outfile, level, namespace_='', name_='Connection', fromsubclass_=False, pretty_print=True):
         pass
     def build(self, node):
         already_processed = set()
@@ -4475,11 +4126,8 @@ class Connection(GeneratedsSuper):
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('postCellId', node)
-        if value is not None and 'postCellId' not in already_processed:
-            already_processed.add('postCellId')
-            self.postCellId = value
         value = find_attr_value_('id', node)
         if value is not None and 'id' not in already_processed:
             already_processed.add('id')
@@ -4493,6 +4141,10 @@ class Connection(GeneratedsSuper):
         if value is not None and 'preCellId' not in already_processed:
             already_processed.add('preCellId')
             self.preCellId = value
+        value = find_attr_value_('postCellId', node)
+        if value is not None and 'postCellId' not in already_processed:
+            already_processed.add('postCellId')
+            self.postCellId = value
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         pass
 # end class Connection
@@ -4504,23 +4156,28 @@ class ExplicitInput(GeneratedsSuper):
     element"""
     subclass = None
     superclass = None
-    def __init__(self, input=None, destination=None, target=None):
+    def __init__(self, target=None, input=None, destination=None):
+        self.original_tagname_ = None
+        self.target = _cast(None, target)
         self.input = _cast(None, input)
         self.destination = _cast(None, destination)
-        self.target = _cast(None, target)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, ExplicitInput)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if ExplicitInput.subclass:
             return ExplicitInput.subclass(*args_, **kwargs_)
         else:
             return ExplicitInput(*args_, **kwargs_)
     factory = staticmethod(factory)
+    def get_target(self): return self.target
+    def set_target(self, target): self.target = target
     def get_input(self): return self.input
     def set_input(self, input): self.input = input
     def get_destination(self): return self.destination
     def set_destination(self, destination): self.destination = destination
-    def get_target(self): return self.target
-    def set_target(self, target): self.target = target
     def hasContent_(self):
         if (
 
@@ -4533,48 +4190,29 @@ class ExplicitInput(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='ExplicitInput')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='ExplicitInput', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='ExplicitInput'):
-        if self.input is not None and 'input' not in already_processed:
-            already_processed.add('input')
-            outfile.write(' input=%s' % (self.gds_format_string(quote_attrib(self.input).encode(ExternalEncoding), input_name='input'), ))
-        if self.destination is not None and 'destination' not in already_processed:
-            already_processed.add('destination')
-            outfile.write(' destination=%s' % (self.gds_format_string(quote_attrib(self.destination).encode(ExternalEncoding), input_name='destination'), ))
         if self.target is not None and 'target' not in already_processed:
             already_processed.add('target')
-            outfile.write(' target=%s' % (self.gds_format_string(quote_attrib(self.target).encode(ExternalEncoding), input_name='target'), ))
-    def exportChildren(self, outfile, level, namespace_='', name_='ExplicitInput', fromsubclass_=False, pretty_print=True):
-        pass
-    def exportLiteral(self, outfile, level, name_='ExplicitInput'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
+            outfile.write(' target=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.target), input_name='target')), ))
         if self.input is not None and 'input' not in already_processed:
             already_processed.add('input')
-            showIndent(outfile, level)
-            outfile.write('input="%s",\n' % (self.input,))
+            outfile.write(' input=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.input), input_name='input')), ))
         if self.destination is not None and 'destination' not in already_processed:
             already_processed.add('destination')
-            showIndent(outfile, level)
-            outfile.write('destination="%s",\n' % (self.destination,))
-        if self.target is not None and 'target' not in already_processed:
-            already_processed.add('target')
-            showIndent(outfile, level)
-            outfile.write('target="%s",\n' % (self.target,))
-    def exportLiteralChildren(self, outfile, level, name_):
+            outfile.write(' destination=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.destination), input_name='destination')), ))
+    def exportChildren(self, outfile, level, namespace_='', name_='ExplicitInput', fromsubclass_=False, pretty_print=True):
         pass
     def build(self, node):
         already_processed = set()
@@ -4582,7 +4220,12 @@ class ExplicitInput(GeneratedsSuper):
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
+        value = find_attr_value_('target', node)
+        if value is not None and 'target' not in already_processed:
+            already_processed.add('target')
+            self.target = value
         value = find_attr_value_('input', node)
         if value is not None and 'input' not in already_processed:
             already_processed.add('input')
@@ -4591,10 +4234,6 @@ class ExplicitInput(GeneratedsSuper):
         if value is not None and 'destination' not in already_processed:
             already_processed.add('destination')
             self.destination = value
-        value = find_attr_value_('target', node)
-        if value is not None and 'target' not in already_processed:
-            already_processed.add('target')
-            self.target = value
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         pass
 # end class ExplicitInput
@@ -4604,26 +4243,35 @@ class Input(GeneratedsSuper):
     """Subject to change as it gets tested with LEMS"""
     subclass = None
     superclass = None
-    def __init__(self, destination=None, id=None, target=None):
-        self.destination = _cast(None, destination)
+    def __init__(self, id=None, target=None, destination=None):
+        self.original_tagname_ = None
         self.id = _cast(int, id)
         self.target = _cast(None, target)
-        pass
+        self.destination = _cast(None, destination)
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Input)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Input.subclass:
             return Input.subclass(*args_, **kwargs_)
         else:
             return Input(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_destination(self): return self.destination
-    def set_destination(self, destination): self.destination = destination
-    def validate_NmlId(self, value):
-        # Validate type NmlId, a restriction on xs:string.
-        pass
     def get_id(self): return self.id
     def set_id(self, id): self.id = id
     def get_target(self): return self.target
     def set_target(self, target): self.target = target
+    def get_destination(self): return self.destination
+    def set_destination(self, destination): self.destination = destination
+    def validate_NmlId(self, value):
+        # Validate type NmlId, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
     def hasContent_(self):
         if (
 
@@ -4636,61 +4284,38 @@ class Input(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Input')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Input', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='Input'):
-        if self.destination is not None and 'destination' not in already_processed:
-            already_processed.add('destination')
-            outfile.write(' destination=%s' % (quote_attrib(self.destination), ))
         if self.id is not None and 'id' not in already_processed:
             already_processed.add('id')
             outfile.write(' id="%s"' % self.gds_format_integer(self.id, input_name='id'))
         if self.target is not None and 'target' not in already_processed:
             already_processed.add('target')
-            outfile.write(' target=%s' % (self.gds_format_string(quote_attrib(self.target).encode(ExternalEncoding), input_name='target'), ))
+            outfile.write(' target=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.target), input_name='target')), ))
+        if self.destination is not None and 'destination' not in already_processed:
+            already_processed.add('destination')
+            outfile.write(' destination=%s' % (quote_attrib(self.destination), ))
     def exportChildren(self, outfile, level, namespace_='', name_='Input', fromsubclass_=False, pretty_print=True):
         pass
-    def exportLiteral(self, outfile, level, name_='Input'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.destination is not None and 'destination' not in already_processed:
-            already_processed.add('destination')
-            showIndent(outfile, level)
-            outfile.write('destination="%s",\n' % (self.destination,))
-        if self.id is not None and 'id' not in already_processed:
-            already_processed.add('id')
-            showIndent(outfile, level)
-            outfile.write('id=%d,\n' % (self.id,))
-        if self.target is not None and 'target' not in already_processed:
-            already_processed.add('target')
-            showIndent(outfile, level)
-            outfile.write('target="%s",\n' % (self.target,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        pass
-    def build(self, node):
+    def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('destination', node)
-        if value is not None and 'destination' not in already_processed:
-            already_processed.add('destination')
-            self.destination = value
-            self.validate_NmlId(self.destination)    # validate type NmlId
         value = find_attr_value_('id', node)
         if value is not None and 'id' not in already_processed:
             already_processed.add('id')
@@ -4704,6 +4329,11 @@ class Input(GeneratedsSuper):
         if value is not None and 'target' not in already_processed:
             already_processed.add('target')
             self.target = value
+        value = find_attr_value_('destination', node)
+        if value is not None and 'destination' not in already_processed:
+            already_processed.add('destination')
+            self.destination = value
+            self.validate_NmlId(self.destination)    # validate type NmlId
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         pass
 # end class Input
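One recurring change in this patch is that every `build(self, node)` now ends with `return self`, which turns construction plus parsing into a single chained expression. A minimal sketch of that pattern (the class here is a hypothetical stand-in, not the generated `Input` bindings):

```python
# Sketch: why build() returning self enables one-line construction.
# With the return value, callers can write Input().build(node) instead of
# creating the object and calling build() in two separate statements.
import xml.etree.ElementTree as ET

class Input(object):
    def __init__(self, id=None, target=None, destination=None):
        self.id = id
        self.target = target
        self.destination = destination

    def build(self, node):
        # Pull attributes off the XML element, then return self so the
        # call chains directly off the constructor.
        if node.get('id') is not None:
            self.id = int(node.get('id'))
        self.target = node.get('target')
        self.destination = node.get('destination')
        return self

node = ET.fromstring('<input id="3" target="pop[3]" destination="synapses"/>')
inp = Input().build(node)   # chaining works because build() returns self
```

The same `return self` appears in `Base`, `Standalone`, `SpikeSourcePoisson`, and `InputList` in this hunk; it is purely additive, so existing two-step callers keep working.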
@@ -4715,10 +4345,16 @@ class Base(GeneratedsSuper):
     subclass = None
     superclass = None
     def __init__(self, id=None, neuroLexId=None, extensiontype_=None):
+        self.original_tagname_ = None
         self.id = _cast(None, id)
         self.neuroLexId = _cast(None, neuroLexId)
         self.extensiontype_ = extensiontype_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Base)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Base.subclass:
             return Base.subclass(*args_, **kwargs_)
         else:
@@ -4726,16 +4362,24 @@ class Base(GeneratedsSuper):
     factory = staticmethod(factory)
     def get_id(self): return self.id
     def set_id(self, id): self.id = id
-    def validate_NmlId(self, value):
-        # Validate type NmlId, a restriction on xs:string.
-        pass
     def get_neuroLexId(self): return self.neuroLexId
     def set_neuroLexId(self, neuroLexId): self.neuroLexId = neuroLexId
-    def validate_NeuroLexId(self, value):
-        # Validate type NeuroLexId, a restriction on xs:string.
-        pass
     def get_extensiontype_(self): return self.extensiontype_
     def set_extensiontype_(self, extensiontype_): self.extensiontype_ = extensiontype_
+    def validate_NmlId(self, value):
+        # Validate type NmlId, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
+    def validate_NeuroLexId(self, value):
+        # Validate type NeuroLexId, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NeuroLexId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NeuroLexId_patterns_, ))
+    validate_NeuroLexId_patterns_ = [['^[a-zA-Z0-9_]*$']]
     def hasContent_(self):
         if (
 
@@ -4748,13 +4392,15 @@ class Base(GeneratedsSuper):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Base')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Base', pretty_print=pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
@@ -4771,29 +4417,13 @@ class Base(GeneratedsSuper):
             outfile.write(' xsi:type="%s"' % self.extensiontype_)
     def exportChildren(self, outfile, level, namespace_='', name_='Base', fromsubclass_=False, pretty_print=True):
         pass
-    def exportLiteral(self, outfile, level, name_='Base'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.id is not None and 'id' not in already_processed:
-            already_processed.add('id')
-            showIndent(outfile, level)
-            outfile.write('id="%s",\n' % (self.id,))
-        if self.neuroLexId is not None and 'neuroLexId' not in already_processed:
-            already_processed.add('neuroLexId')
-            showIndent(outfile, level)
-            outfile.write('neuroLexId="%s",\n' % (self.neuroLexId,))
-    def exportLiteralChildren(self, outfile, level, name_):
-        pass
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('id', node)
         if value is not None and 'id' not in already_processed:
@@ -4821,13 +4451,20 @@ class Standalone(Base):
     subclass = None
     superclass = Base
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, extensiontype_=None):
+        self.original_tagname_ = None
         super(Standalone, self).__init__(id, neuroLexId, extensiontype_, )
         self.name = _cast(None, name)
         self.metaid = _cast(None, metaid)
         self.notes = notes
+        self.validate_Notes(self.notes)
         self.annotation = annotation
         self.extensiontype_ = extensiontype_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Standalone)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Standalone.subclass:
             return Standalone.subclass(*args_, **kwargs_)
         else:
@@ -4835,20 +4472,25 @@ class Standalone(Base):
     factory = staticmethod(factory)
     def get_notes(self): return self.notes
     def set_notes(self, notes): self.notes = notes
-    def validate_Notes(self, value):
-        # Validate type Notes, a restriction on xs:string.
-        pass
     def get_annotation(self): return self.annotation
     def set_annotation(self, annotation): self.annotation = annotation
     def get_name(self): return self.name
     def set_name(self, name): self.name = name
     def get_metaid(self): return self.metaid
     def set_metaid(self, metaid): self.metaid = metaid
-    def validate_MetaId(self, value):
-        # Validate type MetaId, a restriction on xs:string.
-        pass
     def get_extensiontype_(self): return self.extensiontype_
     def set_extensiontype_(self, extensiontype_): self.extensiontype_ = extensiontype_
+    def validate_Notes(self, value):
+        # Validate type Notes, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            pass
+    def validate_MetaId(self, value):
+        # Validate type MetaId, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_MetaId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_MetaId_patterns_, ))
+    validate_MetaId_patterns_ = [['^[a-zA-Z0-9_]*$']]
     def hasContent_(self):
         if (
             self.notes is not None or
@@ -4863,13 +4505,15 @@ class Standalone(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Standalone')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Standalone', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -4878,7 +4522,7 @@ class Standalone(Base):
         super(Standalone, self).exportAttributes(outfile, level, already_processed, namespace_, name_='Standalone')
         if self.name is not None and 'name' not in already_processed:
             already_processed.add('name')
-            outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
+            outfile.write(' name=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.name), input_name='name')), ))
         if self.metaid is not None and 'metaid' not in already_processed:
             already_processed.add('metaid')
             outfile.write(' metaid=%s' % (quote_attrib(self.metaid), ))
@@ -4894,42 +4538,16 @@ class Standalone(Base):
             eol_ = ''
         if self.notes is not None:
             showIndent(outfile, level, pretty_print)
-            outfile.write('<%snotes>%s</%snotes>%s' % (namespace_, self.gds_format_string(quote_xml(self.notes).encode(ExternalEncoding), input_name='notes'), namespace_, eol_))
+            outfile.write('<%snotes>%s</%snotes>%s' % (namespace_, self.gds_encode(self.gds_format_string(quote_xml(self.notes), input_name='notes')), namespace_, eol_))
         if self.annotation is not None:
             self.annotation.export(outfile, level, namespace_, name_='annotation', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='Standalone'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.name is not None and 'name' not in already_processed:
-            already_processed.add('name')
-            showIndent(outfile, level)
-            outfile.write('name="%s",\n' % (self.name,))
-        if self.metaid is not None and 'metaid' not in already_processed:
-            already_processed.add('metaid')
-            showIndent(outfile, level)
-            outfile.write('metaid="%s",\n' % (self.metaid,))
-        super(Standalone, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(Standalone, self).exportLiteralChildren(outfile, level, name_)
-        if self.notes is not None:
-            showIndent(outfile, level)
-            outfile.write('notes=%s,\n' % quote_python(self.notes).encode(ExternalEncoding))
-        if self.annotation is not None:
-            showIndent(outfile, level)
-            outfile.write('annotation=model_.Annotation(\n')
-            self.annotation.exportLiteral(outfile, level, name_='annotation')
-            showIndent(outfile, level)
-            outfile.write('),\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('name', node)
         if value is not None and 'name' not in already_processed:
@@ -4950,11 +4568,13 @@ class Standalone(Base):
             notes_ = child_.text
             notes_ = self.gds_validate_string(notes_, node, 'notes')
             self.notes = notes_
-            self.validate_Notes(self.notes)    # validate type Notes
+            # validate type Notes
+            self.validate_Notes(self.notes)
         elif nodeName_ == 'annotation':
             obj_ = Annotation.factory()
             obj_.build(child_)
-            self.set_annotation(obj_)
+            self.annotation = obj_
+            obj_.original_tagname_ = 'annotation'
         super(Standalone, self).buildChildren(child_, node, nodeName_, True)
 # end class Standalone
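The patch also replaces the stubbed `validate_*` methods with real checks driven by per-type regex pattern lists (e.g. `validate_NmlId_patterns_`), warning rather than raising on a mismatch. A standalone sketch of that mechanism follows; the helper `gds_validate_simple_patterns` is an assumption about the generateDS semantics (a value passes if it matches at least one alternative in each pattern group), not a copy of the library code:

```python
# Sketch of the generated validator pattern: each simple type carries a
# list of regex pattern groups, and validation emits a warning (not an
# exception) when the value fails. Helper semantics are assumed.
import re
import warnings

validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]

def gds_validate_simple_patterns(patterns, value):
    # Assumed semantics: the value must match at least one alternative
    # in every pattern group to be considered valid.
    for group in patterns:
        if not any(re.search(pat, value) for pat in group):
            return False
    return True

def validate_NmlId(value):
    # Mirrors the generated method: warn on mismatch instead of raising,
    # so a non-conforming document still loads.
    if value is not None:
        if not gds_validate_simple_patterns(validate_NmlId_patterns_, value):
            warnings.warn('Value "%s" does not match xsd pattern '
                          'restrictions: %s' % (value, validate_NmlId_patterns_))
```

Warning instead of raising is a deliberate choice here: malformed ids in a large model file are reported but do not abort parsing.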
 
@@ -4962,30 +4582,43 @@ class Standalone(Base):
 class SpikeSourcePoisson(Standalone):
     subclass = None
     superclass = Standalone
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, duration=None, start=None, rate=None):
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, start=None, duration=None, rate=None):
+        self.original_tagname_ = None
         super(SpikeSourcePoisson, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
-        self.duration = _cast(None, duration)
         self.start = _cast(None, start)
+        self.duration = _cast(None, duration)
         self.rate = _cast(None, rate)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, SpikeSourcePoisson)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if SpikeSourcePoisson.subclass:
             return SpikeSourcePoisson.subclass(*args_, **kwargs_)
         else:
             return SpikeSourcePoisson(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_duration(self): return self.duration
-    def set_duration(self, duration): self.duration = duration
-    def validate_Nml2Quantity_time(self, value):
-        # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
     def get_start(self): return self.start
     def set_start(self, start): self.start = start
+    def get_duration(self): return self.duration
+    def set_duration(self, duration): self.duration = duration
     def get_rate(self): return self.rate
     def set_rate(self, rate): self.rate = rate
+    def validate_Nml2Quantity_time(self, value):
+        # Validate type Nml2Quantity_time, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
     def validate_Nml2Quantity_pertime(self, value):
         # Validate type Nml2Quantity_pertime, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_pertime_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_pertime_patterns_, ))
+    validate_Nml2Quantity_pertime_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(per_s|per_ms|Hz)$']]
     def hasContent_(self):
         if (
             super(SpikeSourcePoisson, self).hasContent_()
@@ -4998,69 +4631,50 @@ class SpikeSourcePoisson(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='SpikeSourcePoisson')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='SpikeSourcePoisson', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='SpikeSourcePoisson'):
         super(SpikeSourcePoisson, self).exportAttributes(outfile, level, already_processed, namespace_, name_='SpikeSourcePoisson')
-        if self.duration is not None and 'duration' not in already_processed:
-            already_processed.add('duration')
-            outfile.write(' duration=%s' % (quote_attrib(self.duration), ))
         if self.start is not None and 'start' not in already_processed:
             already_processed.add('start')
             outfile.write(' start=%s' % (quote_attrib(self.start), ))
+        if self.duration is not None and 'duration' not in already_processed:
+            already_processed.add('duration')
+            outfile.write(' duration=%s' % (quote_attrib(self.duration), ))
         if self.rate is not None and 'rate' not in already_processed:
             already_processed.add('rate')
             outfile.write(' rate=%s' % (quote_attrib(self.rate), ))
     def exportChildren(self, outfile, level, namespace_='', name_='SpikeSourcePoisson', fromsubclass_=False, pretty_print=True):
         super(SpikeSourcePoisson, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='SpikeSourcePoisson'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.duration is not None and 'duration' not in already_processed:
-            already_processed.add('duration')
-            showIndent(outfile, level)
-            outfile.write('duration="%s",\n' % (self.duration,))
-        if self.start is not None and 'start' not in already_processed:
-            already_processed.add('start')
-            showIndent(outfile, level)
-            outfile.write('start="%s",\n' % (self.start,))
-        if self.rate is not None and 'rate' not in already_processed:
-            already_processed.add('rate')
-            showIndent(outfile, level)
-            outfile.write('rate="%s",\n' % (self.rate,))
-        super(SpikeSourcePoisson, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(SpikeSourcePoisson, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('duration', node)
-        if value is not None and 'duration' not in already_processed:
-            already_processed.add('duration')
-            self.duration = value
-            self.validate_Nml2Quantity_time(self.duration)    # validate type Nml2Quantity_time
         value = find_attr_value_('start', node)
         if value is not None and 'start' not in already_processed:
             already_processed.add('start')
             self.start = value
             self.validate_Nml2Quantity_time(self.start)    # validate type Nml2Quantity_time
+        value = find_attr_value_('duration', node)
+        if value is not None and 'duration' not in already_processed:
+            already_processed.add('duration')
+            self.duration = value
+            self.validate_Nml2Quantity_time(self.duration)    # validate type Nml2Quantity_time
         value = find_attr_value_('rate', node)
         if value is not None and 'rate' not in already_processed:
             already_processed.add('rate')
@@ -5077,15 +4691,21 @@ class InputList(Base):
     """Subject to change as it gets tested with LEMS"""
     subclass = None
     superclass = Base
-    def __init__(self, id=None, neuroLexId=None, component=None, population=None, input=None):
+    def __init__(self, id=None, neuroLexId=None, population=None, component=None, input=None):
+        self.original_tagname_ = None
         super(InputList, self).__init__(id, neuroLexId, )
-        self.component = _cast(None, component)
         self.population = _cast(None, population)
+        self.component = _cast(None, component)
         if input is None:
             self.input = []
         else:
             self.input = input
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, InputList)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if InputList.subclass:
             return InputList.subclass(*args_, **kwargs_)
         else:
@@ -5094,14 +4714,19 @@ class InputList(Base):
     def get_input(self): return self.input
     def set_input(self, input): self.input = input
     def add_input(self, value): self.input.append(value)
-    def insert_input(self, index, value): self.input[index] = value
+    def insert_input_at(self, index, value): self.input.insert(index, value)
+    def replace_input_at(self, index, value): self.input[index] = value
+    def get_population(self): return self.population
+    def set_population(self, population): self.population = population
     def get_component(self): return self.component
     def set_component(self, component): self.component = component
     def validate_NmlId(self, value):
         # Validate type NmlId, a restriction on xs:string.
-        pass
-    def get_population(self): return self.population
-    def set_population(self, population): self.population = population
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
     def hasContent_(self):
         if (
             self.input or
@@ -5115,25 +4740,27 @@ class InputList(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='InputList')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='InputList', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='InputList'):
         super(InputList, self).exportAttributes(outfile, level, already_processed, namespace_, name_='InputList')
-        if self.component is not None and 'component' not in already_processed:
-            already_processed.add('component')
-            outfile.write(' component=%s' % (quote_attrib(self.component), ))
         if self.population is not None and 'population' not in already_processed:
             already_processed.add('population')
             outfile.write(' population=%s' % (quote_attrib(self.population), ))
+        if self.component is not None and 'component' not in already_processed:
+            already_processed.add('component')
+            outfile.write(' component=%s' % (quote_attrib(self.component), ))
     def exportChildren(self, outfile, level, namespace_='', name_='InputList', fromsubclass_=False, pretty_print=True):
         super(InputList, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
         if pretty_print:
@@ -5142,59 +4769,31 @@ class InputList(Base):
             eol_ = ''
         for input_ in self.input:
             input_.export(outfile, level, namespace_, name_='input', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='InputList'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.component is not None and 'component' not in already_processed:
-            already_processed.add('component')
-            showIndent(outfile, level)
-            outfile.write('component="%s",\n' % (self.component,))
-        if self.population is not None and 'population' not in already_processed:
-            already_processed.add('population')
-            showIndent(outfile, level)
-            outfile.write('population="%s",\n' % (self.population,))
-        super(InputList, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(InputList, self).exportLiteralChildren(outfile, level, name_)
-        showIndent(outfile, level)
-        outfile.write('input=[\n')
-        level += 1
-        for input_ in self.input:
-            showIndent(outfile, level)
-            outfile.write('model_.Input(\n')
-            input_.exportLiteral(outfile, level, name_='Input')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('component', node)
-        if value is not None and 'component' not in already_processed:
-            already_processed.add('component')
-            self.component = value
-            self.validate_NmlId(self.component)    # validate type NmlId
         value = find_attr_value_('population', node)
         if value is not None and 'population' not in already_processed:
             already_processed.add('population')
             self.population = value
             self.validate_NmlId(self.population)    # validate type NmlId
+        value = find_attr_value_('component', node)
+        if value is not None and 'component' not in already_processed:
+            already_processed.add('component')
+            self.component = value
+            self.validate_NmlId(self.component)    # validate type NmlId
         super(InputList, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         if nodeName_ == 'input':
             obj_ = Input.factory()
             obj_.build(child_)
             self.input.append(obj_)
+            obj_.original_tagname_ = 'input'
         super(InputList, self).buildChildren(child_, node, nodeName_, True)
 # end class InputList
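Note the API change in the list containers: the old `insert_input(index, value)`, which silently overwrote `self.input[index]`, is split into `insert_input_at()` (a true `list.insert`, growing the list) and `replace_input_at()` (in-place assignment). A minimal container, hypothetical but mirroring `InputList`, makes the difference concrete:

```python
# Sketch: the insert_*_at / replace_*_at split introduced by this patch.
# The old single method did index assignment, which could not add an
# element mid-list; the new pair makes both intents explicit.
class InputList(object):
    def __init__(self, input=None):
        self.input = [] if input is None else input

    def add_input(self, value):
        self.input.append(value)

    def insert_input_at(self, index, value):
        # Grows the list: existing items from index onward shift right.
        self.input.insert(index, value)

    def replace_input_at(self, index, value):
        # Overwrites in place: the list length is unchanged.
        self.input[index] = value

il = InputList(input=['a', 'b', 'c'])
il.insert_input_at(1, 'x')    # list grows: ['a', 'x', 'b', 'c']
il.replace_input_at(0, 'y')   # same length: ['y', 'x', 'b', 'c']
```

The `Projection.connection` accessors in the next hunk get the identical `insert_connection_at` / `replace_connection_at` treatment.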
 
@@ -5203,16 +4802,22 @@ class Projection(Base):
     """Subject to change as it gets tested with LEMS"""
     subclass = None
     superclass = Base
-    def __init__(self, id=None, neuroLexId=None, postsynapticPopulation=None, presynapticPopulation=None, synapse=None, connection=None):
+    def __init__(self, id=None, neuroLexId=None, presynapticPopulation=None, postsynapticPopulation=None, synapse=None, connection=None):
+        self.original_tagname_ = None
         super(Projection, self).__init__(id, neuroLexId, )
-        self.postsynapticPopulation = _cast(None, postsynapticPopulation)
         self.presynapticPopulation = _cast(None, presynapticPopulation)
+        self.postsynapticPopulation = _cast(None, postsynapticPopulation)
         self.synapse = _cast(None, synapse)
         if connection is None:
             self.connection = []
         else:
             self.connection = connection
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Projection)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Projection.subclass:
             return Projection.subclass(*args_, **kwargs_)
         else:
@@ -5221,16 +4826,21 @@ class Projection(Base):
     def get_connection(self): return self.connection
     def set_connection(self, connection): self.connection = connection
     def add_connection(self, value): self.connection.append(value)
-    def insert_connection(self, index, value): self.connection[index] = value
-    def get_postsynapticPopulation(self): return self.postsynapticPopulation
-    def set_postsynapticPopulation(self, postsynapticPopulation): self.postsynapticPopulation = postsynapticPopulation
-    def validate_NmlId(self, value):
-        # Validate type NmlId, a restriction on xs:string.
-        pass
+    def insert_connection_at(self, index, value): self.connection.insert(index, value)
+    def replace_connection_at(self, index, value): self.connection[index] = value
     def get_presynapticPopulation(self): return self.presynapticPopulation
     def set_presynapticPopulation(self, presynapticPopulation): self.presynapticPopulation = presynapticPopulation
+    def get_postsynapticPopulation(self): return self.postsynapticPopulation
+    def set_postsynapticPopulation(self, postsynapticPopulation): self.postsynapticPopulation = postsynapticPopulation
     def get_synapse(self): return self.synapse
     def set_synapse(self, synapse): self.synapse = synapse
+    def validate_NmlId(self, value):
+        # Validate type NmlId, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
     def hasContent_(self):
         if (
             self.connection or
@@ -5244,25 +4854,27 @@ class Projection(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Projection')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Projection', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='Projection'):
         super(Projection, self).exportAttributes(outfile, level, already_processed, namespace_, name_='Projection')
-        if self.postsynapticPopulation is not None and 'postsynapticPopulation' not in already_processed:
-            already_processed.add('postsynapticPopulation')
-            outfile.write(' postsynapticPopulation=%s' % (quote_attrib(self.postsynapticPopulation), ))
         if self.presynapticPopulation is not None and 'presynapticPopulation' not in already_processed:
             already_processed.add('presynapticPopulation')
             outfile.write(' presynapticPopulation=%s' % (quote_attrib(self.presynapticPopulation), ))
+        if self.postsynapticPopulation is not None and 'postsynapticPopulation' not in already_processed:
+            already_processed.add('postsynapticPopulation')
+            outfile.write(' postsynapticPopulation=%s' % (quote_attrib(self.postsynapticPopulation), ))
         if self.synapse is not None and 'synapse' not in already_processed:
             already_processed.add('synapse')
             outfile.write(' synapse=%s' % (quote_attrib(self.synapse), ))
@@ -5274,57 +4886,24 @@ class Projection(Base):
             eol_ = ''
         for connection_ in self.connection:
             connection_.export(outfile, level, namespace_, name_='connection', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='Projection'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.postsynapticPopulation is not None and 'postsynapticPopulation' not in already_processed:
-            already_processed.add('postsynapticPopulation')
-            showIndent(outfile, level)
-            outfile.write('postsynapticPopulation="%s",\n' % (self.postsynapticPopulation,))
-        if self.presynapticPopulation is not None and 'presynapticPopulation' not in already_processed:
-            already_processed.add('presynapticPopulation')
-            showIndent(outfile, level)
-            outfile.write('presynapticPopulation="%s",\n' % (self.presynapticPopulation,))
-        if self.synapse is not None and 'synapse' not in already_processed:
-            already_processed.add('synapse')
-            showIndent(outfile, level)
-            outfile.write('synapse="%s",\n' % (self.synapse,))
-        super(Projection, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(Projection, self).exportLiteralChildren(outfile, level, name_)
-        showIndent(outfile, level)
-        outfile.write('connection=[\n')
-        level += 1
-        for connection_ in self.connection:
-            showIndent(outfile, level)
-            outfile.write('model_.Connection(\n')
-            connection_.exportLiteral(outfile, level, name_='Connection')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('postsynapticPopulation', node)
-        if value is not None and 'postsynapticPopulation' not in already_processed:
-            already_processed.add('postsynapticPopulation')
-            self.postsynapticPopulation = value
-            self.validate_NmlId(self.postsynapticPopulation)    # validate type NmlId
         value = find_attr_value_('presynapticPopulation', node)
         if value is not None and 'presynapticPopulation' not in already_processed:
             already_processed.add('presynapticPopulation')
             self.presynapticPopulation = value
             self.validate_NmlId(self.presynapticPopulation)    # validate type NmlId
+        value = find_attr_value_('postsynapticPopulation', node)
+        if value is not None and 'postsynapticPopulation' not in already_processed:
+            already_processed.add('postsynapticPopulation')
+            self.postsynapticPopulation = value
+            self.validate_NmlId(self.postsynapticPopulation)    # validate type NmlId
         value = find_attr_value_('synapse', node)
         if value is not None and 'synapse' not in already_processed:
             already_processed.add('synapse')
@@ -5336,6 +4915,7 @@ class Projection(Base):
             obj_ = Connection.factory()
             obj_.build(child_)
             self.connection.append(obj_)
+            obj_.original_tagname_ = 'connection'
         super(Projection, self).buildChildren(child_, node, nodeName_, True)
 # end class Projection
 
@@ -5344,6 +4924,7 @@ class CellSet(Base):
     subclass = None
     superclass = Base
     def __init__(self, id=None, neuroLexId=None, select=None, anytypeobjs_=None):
+        self.original_tagname_ = None
         super(CellSet, self).__init__(id, neuroLexId, )
         self.select = _cast(None, select)
         if anytypeobjs_ is None:
@@ -5351,6 +4932,11 @@ class CellSet(Base):
         else:
             self.anytypeobjs_ = anytypeobjs_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, CellSet)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if CellSet.subclass:
             return CellSet.subclass(*args_, **kwargs_)
         else:
@@ -5375,13 +4961,15 @@ class CellSet(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='CellSet')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='CellSet', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -5390,7 +4978,7 @@ class CellSet(Base):
         super(CellSet, self).exportAttributes(outfile, level, already_processed, namespace_, name_='CellSet')
         if self.select is not None and 'select' not in already_processed:
             already_processed.add('select')
-            outfile.write(' select=%s' % (self.gds_format_string(quote_attrib(self.select).encode(ExternalEncoding), input_name='select'), ))
+            outfile.write(' select=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.select), input_name='select')), ))
     def exportChildren(self, outfile, level, namespace_='', name_='CellSet', fromsubclass_=False, pretty_print=True):
         super(CellSet, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
         if pretty_print:
@@ -5399,34 +4987,13 @@ class CellSet(Base):
             eol_ = ''
         for obj_ in self.anytypeobjs_:
             obj_.export(outfile, level, namespace_, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='CellSet'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.select is not None and 'select' not in already_processed:
-            already_processed.add('select')
-            showIndent(outfile, level)
-            outfile.write('select="%s",\n' % (self.select,))
-        super(CellSet, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(CellSet, self).exportLiteralChildren(outfile, level, name_)
-        showIndent(outfile, level)
-        outfile.write('anytypeobjs_=[\n')
-        level += 1
-        for anytypeobjs_ in self.anytypeobjs_:
-            anytypeobjs_.exportLiteral(outfile, level)
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('select', node)
         if value is not None and 'select' not in already_processed:
@@ -5444,20 +5011,26 @@ class CellSet(Base):
 class Population(Standalone):
     subclass = None
     superclass = Standalone
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, extracellularProperties=None, network=None, component=None, cell=None, type_=None, size=None, layout=None, instance=None):
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cell=None, network=None, component=None, size=None, type_=None, extracellularProperties=None, layout=None, instance=None):
+        self.original_tagname_ = None
         super(Population, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
-        self.extracellularProperties = _cast(None, extracellularProperties)
+        self.cell = _cast(None, cell)
         self.network = _cast(None, network)
         self.component = _cast(None, component)
-        self.cell = _cast(None, cell)
-        self.type_ = _cast(None, type_)
         self.size = _cast(int, size)
+        self.type_ = _cast(None, type_)
+        self.extracellularProperties = _cast(None, extracellularProperties)
         self.layout = layout
         if instance is None:
             self.instance = []
         else:
             self.instance = instance
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Population)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Population.subclass:
             return Population.subclass(*args_, **kwargs_)
         else:
@@ -5468,25 +5041,39 @@ class Population(Standalone):
     def get_instance(self): return self.instance
     def set_instance(self, instance): self.instance = instance
     def add_instance(self, value): self.instance.append(value)
-    def insert_instance(self, index, value): self.instance[index] = value
-    def get_extracellularProperties(self): return self.extracellularProperties
-    def set_extracellularProperties(self, extracellularProperties): self.extracellularProperties = extracellularProperties
-    def validate_NmlId(self, value):
-        # Validate type NmlId, a restriction on xs:string.
-        pass
+    def insert_instance_at(self, index, value): self.instance.insert(index, value)
+    def replace_instance_at(self, index, value): self.instance[index] = value
+    def get_cell(self): return self.cell
+    def set_cell(self, cell): self.cell = cell
     def get_network(self): return self.network
     def set_network(self, network): self.network = network
     def get_component(self): return self.component
     def set_component(self, component): self.component = component
-    def get_cell(self): return self.cell
-    def set_cell(self, cell): self.cell = cell
+    def get_size(self): return self.size
+    def set_size(self, size): self.size = size
     def get_type(self): return self.type_
     def set_type(self, type_): self.type_ = type_
+    def get_extracellularProperties(self): return self.extracellularProperties
+    def set_extracellularProperties(self, extracellularProperties): self.extracellularProperties = extracellularProperties
+    def validate_NmlId(self, value):
+        # Validate type NmlId, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
     def validate_populationTypes(self, value):
         # Validate type populationTypes, a restriction on xs:string.
-        pass
-    def get_size(self): return self.size
-    def set_size(self, size): self.size = size
+        if value is not None and Validate_simpletypes_:
+            value = str(value)
+            enumerations = ['population', 'populationList']
+            enumeration_respectee = False
+            for enum in enumerations:
+                if value == enum:
+                    enumeration_respectee = True
+                    break
+            if not enumeration_respectee:
+                warnings_.warn('Value "%(value)s" does not match xsd enumeration restriction on populationTypes' % {"value" : value.encode("utf-8")} )
     def hasContent_(self):
         if (
             self.layout is not None or
@@ -5501,37 +5088,39 @@ class Population(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Population')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Population', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='Population'):
         super(Population, self).exportAttributes(outfile, level, already_processed, namespace_, name_='Population')
-        if self.extracellularProperties is not None and 'extracellularProperties' not in already_processed:
-            already_processed.add('extracellularProperties')
-            outfile.write(' extracellularProperties=%s' % (quote_attrib(self.extracellularProperties), ))
+        if self.cell is not None and 'cell' not in already_processed:
+            already_processed.add('cell')
+            outfile.write(' cell=%s' % (quote_attrib(self.cell), ))
         if self.network is not None and 'network' not in already_processed:
             already_processed.add('network')
             outfile.write(' network=%s' % (quote_attrib(self.network), ))
         if self.component is not None and 'component' not in already_processed:
             already_processed.add('component')
             outfile.write(' component=%s' % (quote_attrib(self.component), ))
-        if self.cell is not None and 'cell' not in already_processed:
-            already_processed.add('cell')
-            outfile.write(' cell=%s' % (quote_attrib(self.cell), ))
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            outfile.write(' type=%s' % (quote_attrib(self.type_), ))
         if self.size is not None and 'size' not in already_processed:
             already_processed.add('size')
             outfile.write(' size="%s"' % self.gds_format_integer(self.size, input_name='size'))
+        if self.type_ is not None and 'type_' not in already_processed:
+            already_processed.add('type_')
+            outfile.write(' type=%s' % (quote_attrib(self.type_), ))
+        if self.extracellularProperties is not None and 'extracellularProperties' not in already_processed:
+            already_processed.add('extracellularProperties')
+            outfile.write(' extracellularProperties=%s' % (quote_attrib(self.extracellularProperties), ))
     def exportChildren(self, outfile, level, namespace_='', name_='Population', fromsubclass_=False, pretty_print=True):
         super(Population, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
         if pretty_print:
@@ -5542,90 +5131,29 @@ class Population(Standalone):
             self.layout.export(outfile, level, namespace_, name_='layout', pretty_print=pretty_print)
         for instance_ in self.instance:
             instance_.export(outfile, level, namespace_, name_='instance', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='Population'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.extracellularProperties is not None and 'extracellularProperties' not in already_processed:
-            already_processed.add('extracellularProperties')
-            showIndent(outfile, level)
-            outfile.write('extracellularProperties="%s",\n' % (self.extracellularProperties,))
-        if self.network is not None and 'network' not in already_processed:
-            already_processed.add('network')
-            showIndent(outfile, level)
-            outfile.write('network="%s",\n' % (self.network,))
-        if self.component is not None and 'component' not in already_processed:
-            already_processed.add('component')
-            showIndent(outfile, level)
-            outfile.write('component="%s",\n' % (self.component,))
-        if self.cell is not None and 'cell' not in already_processed:
-            already_processed.add('cell')
-            showIndent(outfile, level)
-            outfile.write('cell="%s",\n' % (self.cell,))
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            showIndent(outfile, level)
-            outfile.write('type_="%s",\n' % (self.type_,))
-        if self.size is not None and 'size' not in already_processed:
-            already_processed.add('size')
-            showIndent(outfile, level)
-            outfile.write('size=%d,\n' % (self.size,))
-        super(Population, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(Population, self).exportLiteralChildren(outfile, level, name_)
-        if self.layout is not None:
-            showIndent(outfile, level)
-            outfile.write('layout=model_.Layout(\n')
-            self.layout.exportLiteral(outfile, level, name_='layout')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        showIndent(outfile, level)
-        outfile.write('instance=[\n')
-        level += 1
-        for instance_ in self.instance:
-            showIndent(outfile, level)
-            outfile.write('model_.Instance(\n')
-            instance_.exportLiteral(outfile, level, name_='Instance')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('extracellularProperties', node)
-        if value is not None and 'extracellularProperties' not in already_processed:
-            already_processed.add('extracellularProperties')
-            self.extracellularProperties = value
-            self.validate_NmlId(self.extracellularProperties)    # validate type NmlId
-        value = find_attr_value_('network', node)
-        if value is not None and 'network' not in already_processed:
-            already_processed.add('network')
-            self.network = value
+        value = find_attr_value_('cell', node)
+        if value is not None and 'cell' not in already_processed:
+            already_processed.add('cell')
+            self.cell = value
+            self.validate_NmlId(self.cell)    # validate type NmlId
+        value = find_attr_value_('network', node)
+        if value is not None and 'network' not in already_processed:
+            already_processed.add('network')
+            self.network = value
             self.validate_NmlId(self.network)    # validate type NmlId
         value = find_attr_value_('component', node)
         if value is not None and 'component' not in already_processed:
             already_processed.add('component')
             self.component = value
             self.validate_NmlId(self.component)    # validate type NmlId
-        value = find_attr_value_('cell', node)
-        if value is not None and 'cell' not in already_processed:
-            already_processed.add('cell')
-            self.cell = value
-            self.validate_NmlId(self.cell)    # validate type NmlId
-        value = find_attr_value_('type', node)
-        if value is not None and 'type' not in already_processed:
-            already_processed.add('type')
-            self.type_ = value
-            self.validate_populationTypes(self.type_)    # validate type populationTypes
         value = find_attr_value_('size', node)
         if value is not None and 'size' not in already_processed:
             already_processed.add('size')
@@ -5633,16 +5161,28 @@ class Population(Standalone):
                 self.size = int(value)
             except ValueError as exp:
                 raise_parse_error(node, 'Bad integer attribute: %s' % exp)
+        value = find_attr_value_('type', node)
+        if value is not None and 'type' not in already_processed:
+            already_processed.add('type')
+            self.type_ = value
+            self.validate_populationTypes(self.type_)    # validate type populationTypes
+        value = find_attr_value_('extracellularProperties', node)
+        if value is not None and 'extracellularProperties' not in already_processed:
+            already_processed.add('extracellularProperties')
+            self.extracellularProperties = value
+            self.validate_NmlId(self.extracellularProperties)    # validate type NmlId
         super(Population, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         if nodeName_ == 'layout':
             obj_ = Layout.factory()
             obj_.build(child_)
-            self.set_layout(obj_)
+            self.layout = obj_
+            obj_.original_tagname_ = 'layout'
         elif nodeName_ == 'instance':
             obj_ = Instance.factory()
             obj_.build(child_)
             self.instance.append(obj_)
+            obj_.original_tagname_ = 'instance'
         super(Population, self).buildChildren(child_, node, nodeName_, True)
 # end class Population
 
@@ -5651,6 +5191,7 @@ class Region(Base):
     subclass = None
     superclass = Base
     def __init__(self, id=None, neuroLexId=None, space=None, anytypeobjs_=None):
+        self.original_tagname_ = None
         super(Region, self).__init__(id, neuroLexId, )
         self.space = _cast(None, space)
         if anytypeobjs_ is None:
@@ -5658,6 +5199,11 @@ class Region(Base):
         else:
             self.anytypeobjs_ = anytypeobjs_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Region)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Region.subclass:
             return Region.subclass(*args_, **kwargs_)
         else:
@@ -5671,7 +5217,11 @@ class Region(Base):
     def set_space(self, space): self.space = space
     def validate_NmlId(self, value):
         # Validate type NmlId, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
     def hasContent_(self):
         if (
             self.anytypeobjs_ or
@@ -5685,13 +5235,15 @@ class Region(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Region')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Region', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -5709,34 +5261,13 @@ class Region(Base):
             eol_ = ''
         for obj_ in self.anytypeobjs_:
             obj_.export(outfile, level, namespace_, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='Region'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.space is not None and 'space' not in already_processed:
-            already_processed.add('space')
-            showIndent(outfile, level)
-            outfile.write('space="%s",\n' % (self.space,))
-        super(Region, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(Region, self).exportLiteralChildren(outfile, level, name_)
-        showIndent(outfile, level)
-        outfile.write('anytypeobjs_=[\n')
-        level += 1
-        for anytypeobjs_ in self.anytypeobjs_:
-            anytypeobjs_.exportLiteral(outfile, level)
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('space', node)
         if value is not None and 'space' not in already_processed:
@@ -5756,10 +5287,16 @@ class Space(Base):
     subclass = None
     superclass = Base
     def __init__(self, id=None, neuroLexId=None, basedOn=None, structure=None):
+        self.original_tagname_ = None
         super(Space, self).__init__(id, neuroLexId, )
         self.basedOn = _cast(None, basedOn)
         self.structure = structure
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Space)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Space.subclass:
             return Space.subclass(*args_, **kwargs_)
         else:
@@ -5771,7 +5308,16 @@ class Space(Base):
     def set_basedOn(self, basedOn): self.basedOn = basedOn
     def validate_allowedSpaces(self, value):
         # Validate type allowedSpaces, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            value = str(value)
+            enumerations = ['Euclidean_1D', 'Euclidean_2D', 'Euclidean_3D', 'Grid_1D', 'Grid_2D', 'Grid_3D']
+            enumeration_respectee = False
+            for enum in enumerations:
+                if value == enum:
+                    enumeration_respectee = True
+                    break
+            if not enumeration_respectee:
+                warnings_.warn('Value "%(value)s" does not match xsd enumeration restriction on allowedSpaces' % {"value" : value.encode("utf-8")} )
     def hasContent_(self):
         if (
             self.structure is not None or
@@ -5785,13 +5331,15 @@ class Space(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Space')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Space', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -5809,32 +5357,13 @@ class Space(Base):
             eol_ = ''
         if self.structure is not None:
             self.structure.export(outfile, level, namespace_, name_='structure', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='Space'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.basedOn is not None and 'basedOn' not in already_processed:
-            already_processed.add('basedOn')
-            showIndent(outfile, level)
-            outfile.write('basedOn="%s",\n' % (self.basedOn,))
-        super(Space, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(Space, self).exportLiteralChildren(outfile, level, name_)
-        if self.structure is not None:
-            showIndent(outfile, level)
-            outfile.write('structure=model_.SpaceStructure(\n')
-            self.structure.exportLiteral(outfile, level, name_='structure')
-            showIndent(outfile, level)
-            outfile.write('),\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('basedOn', node)
         if value is not None and 'basedOn' not in already_processed:
@@ -5846,7 +5375,8 @@ class Space(Base):
         if nodeName_ == 'structure':
             obj_ = SpaceStructure.factory()
             obj_.build(child_)
-            self.set_structure(obj_)
+            self.structure = obj_
+            obj_.original_tagname_ = 'structure'
         super(Space, self).buildChildren(child_, node, nodeName_, True)
 # end class Space
 
@@ -5855,6 +5385,7 @@ class Network(Standalone):
     subclass = None
     superclass = Standalone
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, space=None, region=None, extracellularProperties=None, population=None, cellSet=None, synapticConnection=None, projection=None, explicitInput=None, inputList=None):
+        self.original_tagname_ = None
         super(Network, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
         if space is None:
             self.space = []
@@ -5893,6 +5424,11 @@ class Network(Standalone):
         else:
             self.inputList = inputList
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Network)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Network.subclass:
             return Network.subclass(*args_, **kwargs_)
         else:
@@ -5901,39 +5437,48 @@ class Network(Standalone):
     def get_space(self): return self.space
     def set_space(self, space): self.space = space
     def add_space(self, value): self.space.append(value)
-    def insert_space(self, index, value): self.space[index] = value
+    def insert_space_at(self, index, value): self.space.insert(index, value)
+    def replace_space_at(self, index, value): self.space[index] = value
     def get_region(self): return self.region
     def set_region(self, region): self.region = region
     def add_region(self, value): self.region.append(value)
-    def insert_region(self, index, value): self.region[index] = value
+    def insert_region_at(self, index, value): self.region.insert(index, value)
+    def replace_region_at(self, index, value): self.region[index] = value
     def get_extracellularProperties(self): return self.extracellularProperties
     def set_extracellularProperties(self, extracellularProperties): self.extracellularProperties = extracellularProperties
     def add_extracellularProperties(self, value): self.extracellularProperties.append(value)
-    def insert_extracellularProperties(self, index, value): self.extracellularProperties[index] = value
+    def insert_extracellularProperties_at(self, index, value): self.extracellularProperties.insert(index, value)
+    def replace_extracellularProperties_at(self, index, value): self.extracellularProperties[index] = value
     def get_population(self): return self.population
     def set_population(self, population): self.population = population
     def add_population(self, value): self.population.append(value)
-    def insert_population(self, index, value): self.population[index] = value
+    def insert_population_at(self, index, value): self.population.insert(index, value)
+    def replace_population_at(self, index, value): self.population[index] = value
     def get_cellSet(self): return self.cellSet
     def set_cellSet(self, cellSet): self.cellSet = cellSet
     def add_cellSet(self, value): self.cellSet.append(value)
-    def insert_cellSet(self, index, value): self.cellSet[index] = value
+    def insert_cellSet_at(self, index, value): self.cellSet.insert(index, value)
+    def replace_cellSet_at(self, index, value): self.cellSet[index] = value
     def get_synapticConnection(self): return self.synapticConnection
     def set_synapticConnection(self, synapticConnection): self.synapticConnection = synapticConnection
     def add_synapticConnection(self, value): self.synapticConnection.append(value)
-    def insert_synapticConnection(self, index, value): self.synapticConnection[index] = value
+    def insert_synapticConnection_at(self, index, value): self.synapticConnection.insert(index, value)
+    def replace_synapticConnection_at(self, index, value): self.synapticConnection[index] = value
     def get_projection(self): return self.projection
     def set_projection(self, projection): self.projection = projection
     def add_projection(self, value): self.projection.append(value)
-    def insert_projection(self, index, value): self.projection[index] = value
+    def insert_projection_at(self, index, value): self.projection.insert(index, value)
+    def replace_projection_at(self, index, value): self.projection[index] = value
     def get_explicitInput(self): return self.explicitInput
     def set_explicitInput(self, explicitInput): self.explicitInput = explicitInput
     def add_explicitInput(self, value): self.explicitInput.append(value)
-    def insert_explicitInput(self, index, value): self.explicitInput[index] = value
+    def insert_explicitInput_at(self, index, value): self.explicitInput.insert(index, value)
+    def replace_explicitInput_at(self, index, value): self.explicitInput[index] = value
     def get_inputList(self): return self.inputList
     def set_inputList(self, inputList): self.inputList = inputList
     def add_inputList(self, value): self.inputList.append(value)
-    def insert_inputList(self, index, value): self.inputList[index] = value
+    def insert_inputList_at(self, index, value): self.inputList.insert(index, value)
+    def replace_inputList_at(self, index, value): self.inputList[index] = value
     def hasContent_(self):
         if (
             self.space or
@@ -5955,13 +5500,15 @@ class Network(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Network')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Network', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -5992,130 +5539,13 @@ class Network(Standalone):
             explicitInput_.export(outfile, level, namespace_, name_='explicitInput', pretty_print=pretty_print)
         for inputList_ in self.inputList:
             inputList_.export(outfile, level, namespace_, name_='inputList', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='Network'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        super(Network, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(Network, self).exportLiteralChildren(outfile, level, name_)
-        showIndent(outfile, level)
-        outfile.write('space=[\n')
-        level += 1
-        for space_ in self.space:
-            showIndent(outfile, level)
-            outfile.write('model_.Space(\n')
-            space_.exportLiteral(outfile, level, name_='Space')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('region=[\n')
-        level += 1
-        for region_ in self.region:
-            showIndent(outfile, level)
-            outfile.write('model_.Region(\n')
-            region_.exportLiteral(outfile, level, name_='Region')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('extracellularProperties=[\n')
-        level += 1
-        for extracellularProperties_ in self.extracellularProperties:
-            showIndent(outfile, level)
-            outfile.write('model_.ExtracellularPropertiesLocal(\n')
-            extracellularProperties_.exportLiteral(outfile, level, name_='ExtracellularPropertiesLocal')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('population=[\n')
-        level += 1
-        for population_ in self.population:
-            showIndent(outfile, level)
-            outfile.write('model_.Population(\n')
-            population_.exportLiteral(outfile, level, name_='Population')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('cellSet=[\n')
-        level += 1
-        for cellSet_ in self.cellSet:
-            showIndent(outfile, level)
-            outfile.write('model_.CellSet(\n')
-            cellSet_.exportLiteral(outfile, level, name_='CellSet')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('synapticConnection=[\n')
-        level += 1
-        for synapticConnection_ in self.synapticConnection:
-            showIndent(outfile, level)
-            outfile.write('model_.SynapticConnection(\n')
-            synapticConnection_.exportLiteral(outfile, level, name_='SynapticConnection')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('projection=[\n')
-        level += 1
-        for projection_ in self.projection:
-            showIndent(outfile, level)
-            outfile.write('model_.Projection(\n')
-            projection_.exportLiteral(outfile, level, name_='Projection')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('explicitInput=[\n')
-        level += 1
-        for explicitInput_ in self.explicitInput:
-            showIndent(outfile, level)
-            outfile.write('model_.ExplicitInput(\n')
-            explicitInput_.exportLiteral(outfile, level, name_='ExplicitInput')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('inputList=[\n')
-        level += 1
-        for inputList_ in self.inputList:
-            showIndent(outfile, level)
-            outfile.write('model_.InputList(\n')
-            inputList_.exportLiteral(outfile, level, name_='InputList')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         super(Network, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
@@ -6123,38 +5553,47 @@ class Network(Standalone):
             obj_ = Space.factory()
             obj_.build(child_)
             self.space.append(obj_)
+            obj_.original_tagname_ = 'space'
         elif nodeName_ == 'region':
             obj_ = Region.factory()
             obj_.build(child_)
             self.region.append(obj_)
+            obj_.original_tagname_ = 'region'
         elif nodeName_ == 'extracellularProperties':
             obj_ = ExtracellularPropertiesLocal.factory()
             obj_.build(child_)
             self.extracellularProperties.append(obj_)
+            obj_.original_tagname_ = 'extracellularProperties'
         elif nodeName_ == 'population':
             obj_ = Population.factory()
             obj_.build(child_)
             self.population.append(obj_)
+            obj_.original_tagname_ = 'population'
         elif nodeName_ == 'cellSet':
             obj_ = CellSet.factory()
             obj_.build(child_)
             self.cellSet.append(obj_)
+            obj_.original_tagname_ = 'cellSet'
         elif nodeName_ == 'synapticConnection':
             obj_ = SynapticConnection.factory()
             obj_.build(child_)
             self.synapticConnection.append(obj_)
+            obj_.original_tagname_ = 'synapticConnection'
         elif nodeName_ == 'projection':
             obj_ = Projection.factory()
             obj_.build(child_)
             self.projection.append(obj_)
+            obj_.original_tagname_ = 'projection'
         elif nodeName_ == 'explicitInput':
             obj_ = ExplicitInput.factory()
             obj_.build(child_)
             self.explicitInput.append(obj_)
+            obj_.original_tagname_ = 'explicitInput'
         elif nodeName_ == 'inputList':
             obj_ = InputList.factory()
             obj_.build(child_)
             self.inputList.append(obj_)
+            obj_.original_tagname_ = 'inputList'
         super(Network, self).buildChildren(child_, node, nodeName_, True)
 # end class Network
 
@@ -6163,10 +5602,15 @@ class SpikeGeneratorPoisson(Standalone):
     subclass = None
     superclass = Standalone
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, averageRate=None):
+        self.original_tagname_ = None
         super(SpikeGeneratorPoisson, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
         self.averageRate = _cast(None, averageRate)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, SpikeGeneratorPoisson)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if SpikeGeneratorPoisson.subclass:
             return SpikeGeneratorPoisson.subclass(*args_, **kwargs_)
         else:
@@ -6176,7 +5620,11 @@ class SpikeGeneratorPoisson(Standalone):
     def set_averageRate(self, averageRate): self.averageRate = averageRate
     def validate_Nml2Quantity_pertime(self, value):
         # Validate type Nml2Quantity_pertime, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_pertime_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_pertime_patterns_, ))
+    validate_Nml2Quantity_pertime_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(per_s|per_ms|Hz)$']]
     def hasContent_(self):
         if (
             super(SpikeGeneratorPoisson, self).hasContent_()
@@ -6189,13 +5637,15 @@ class SpikeGeneratorPoisson(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='SpikeGeneratorPoisson')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='SpikeGeneratorPoisson', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -6207,26 +5657,13 @@ class SpikeGeneratorPoisson(Standalone):
             outfile.write(' averageRate=%s' % (quote_attrib(self.averageRate), ))
     def exportChildren(self, outfile, level, namespace_='', name_='SpikeGeneratorPoisson', fromsubclass_=False, pretty_print=True):
         super(SpikeGeneratorPoisson, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='SpikeGeneratorPoisson'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.averageRate is not None and 'averageRate' not in already_processed:
-            already_processed.add('averageRate')
-            showIndent(outfile, level)
-            outfile.write('averageRate="%s",\n' % (self.averageRate,))
-        super(SpikeGeneratorPoisson, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(SpikeGeneratorPoisson, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('averageRate', node)
         if value is not None and 'averageRate' not in already_processed:
@@ -6243,24 +5680,33 @@ class SpikeGeneratorPoisson(Standalone):
 class SpikeGeneratorRandom(Standalone):
     subclass = None
     superclass = Standalone
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, minISI=None, maxISI=None):
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, maxISI=None, minISI=None):
+        self.original_tagname_ = None
         super(SpikeGeneratorRandom, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
-        self.minISI = _cast(None, minISI)
         self.maxISI = _cast(None, maxISI)
-        pass
+        self.minISI = _cast(None, minISI)
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, SpikeGeneratorRandom)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if SpikeGeneratorRandom.subclass:
             return SpikeGeneratorRandom.subclass(*args_, **kwargs_)
         else:
             return SpikeGeneratorRandom(*args_, **kwargs_)
     factory = staticmethod(factory)
+    def get_maxISI(self): return self.maxISI
+    def set_maxISI(self, maxISI): self.maxISI = maxISI
     def get_minISI(self): return self.minISI
     def set_minISI(self, minISI): self.minISI = minISI
     def validate_Nml2Quantity_time(self, value):
         # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
-    def get_maxISI(self): return self.maxISI
-    def set_maxISI(self, maxISI): self.maxISI = maxISI
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
     def hasContent_(self):
         if (
             super(SpikeGeneratorRandom, self).hasContent_()
@@ -6273,62 +5719,47 @@ class SpikeGeneratorRandom(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='SpikeGeneratorRandom')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='SpikeGeneratorRandom', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='SpikeGeneratorRandom'):
         super(SpikeGeneratorRandom, self).exportAttributes(outfile, level, already_processed, namespace_, name_='SpikeGeneratorRandom')
-        if self.minISI is not None and 'minISI' not in already_processed:
-            already_processed.add('minISI')
-            outfile.write(' minISI=%s' % (quote_attrib(self.minISI), ))
         if self.maxISI is not None and 'maxISI' not in already_processed:
             already_processed.add('maxISI')
             outfile.write(' maxISI=%s' % (quote_attrib(self.maxISI), ))
-    def exportChildren(self, outfile, level, namespace_='', name_='SpikeGeneratorRandom', fromsubclass_=False, pretty_print=True):
-        super(SpikeGeneratorRandom, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='SpikeGeneratorRandom'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
         if self.minISI is not None and 'minISI' not in already_processed:
             already_processed.add('minISI')
-            showIndent(outfile, level)
-            outfile.write('minISI="%s",\n' % (self.minISI,))
-        if self.maxISI is not None and 'maxISI' not in already_processed:
-            already_processed.add('maxISI')
-            showIndent(outfile, level)
-            outfile.write('maxISI="%s",\n' % (self.maxISI,))
-        super(SpikeGeneratorRandom, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(SpikeGeneratorRandom, self).exportLiteralChildren(outfile, level, name_)
+            outfile.write(' minISI=%s' % (quote_attrib(self.minISI), ))
+    def exportChildren(self, outfile, level, namespace_='', name_='SpikeGeneratorRandom', fromsubclass_=False, pretty_print=True):
+        super(SpikeGeneratorRandom, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('minISI', node)
-        if value is not None and 'minISI' not in already_processed:
-            already_processed.add('minISI')
-            self.minISI = value
-            self.validate_Nml2Quantity_time(self.minISI)    # validate type Nml2Quantity_time
         value = find_attr_value_('maxISI', node)
         if value is not None and 'maxISI' not in already_processed:
             already_processed.add('maxISI')
             self.maxISI = value
             self.validate_Nml2Quantity_time(self.maxISI)    # validate type Nml2Quantity_time
+        value = find_attr_value_('minISI', node)
+        if value is not None and 'minISI' not in already_processed:
+            already_processed.add('minISI')
+            self.minISI = value
+            self.validate_Nml2Quantity_time(self.minISI)    # validate type Nml2Quantity_time
         super(SpikeGeneratorRandom, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         super(SpikeGeneratorRandom, self).buildChildren(child_, node, nodeName_, True)
@@ -6340,10 +5771,15 @@ class SpikeGenerator(Standalone):
     subclass = None
     superclass = Standalone
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, period=None):
+        self.original_tagname_ = None
         super(SpikeGenerator, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
         self.period = _cast(None, period)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, SpikeGenerator)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if SpikeGenerator.subclass:
             return SpikeGenerator.subclass(*args_, **kwargs_)
         else:
@@ -6353,7 +5789,11 @@ class SpikeGenerator(Standalone):
     def set_period(self, period): self.period = period
     def validate_Nml2Quantity_time(self, value):
         # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
     def hasContent_(self):
         if (
             super(SpikeGenerator, self).hasContent_()
@@ -6366,13 +5806,15 @@ class SpikeGenerator(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='SpikeGenerator')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='SpikeGenerator', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -6384,26 +5826,13 @@ class SpikeGenerator(Standalone):
             outfile.write(' period=%s' % (quote_attrib(self.period), ))
     def exportChildren(self, outfile, level, namespace_='', name_='SpikeGenerator', fromsubclass_=False, pretty_print=True):
         super(SpikeGenerator, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='SpikeGenerator'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.period is not None and 'period' not in already_processed:
-            already_processed.add('period')
-            showIndent(outfile, level)
-            outfile.write('period="%s",\n' % (self.period,))
-        super(SpikeGenerator, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(SpikeGenerator, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('period', node)
         if value is not None and 'period' not in already_processed:
@@ -6421,12 +5850,18 @@ class SpikeArray(Standalone):
     subclass = None
     superclass = Standalone
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, spike=None):
+        self.original_tagname_ = None
         super(SpikeArray, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
         if spike is None:
             self.spike = []
         else:
             self.spike = spike
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, SpikeArray)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if SpikeArray.subclass:
             return SpikeArray.subclass(*args_, **kwargs_)
         else:
@@ -6435,7 +5870,8 @@ class SpikeArray(Standalone):
     def get_spike(self): return self.spike
     def set_spike(self, spike): self.spike = spike
     def add_spike(self, value): self.spike.append(value)
-    def insert_spike(self, index, value): self.spike[index] = value
+    def insert_spike_at(self, index, value): self.spike.insert(index, value)
+    def replace_spike_at(self, index, value): self.spike[index] = value
     def hasContent_(self):
         if (
             self.spike or
@@ -6449,13 +5885,15 @@ class SpikeArray(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='SpikeArray')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='SpikeArray', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -6470,34 +5908,13 @@ class SpikeArray(Standalone):
             eol_ = ''
         for spike_ in self.spike:
             spike_.export(outfile, level, namespace_, name_='spike', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='SpikeArray'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        super(SpikeArray, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(SpikeArray, self).exportLiteralChildren(outfile, level, name_)
-        showIndent(outfile, level)
-        outfile.write('spike=[\n')
-        level += 1
-        for spike_ in self.spike:
-            showIndent(outfile, level)
-            outfile.write('model_.Spike(\n')
-            spike_.exportLiteral(outfile, level, name_='Spike')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         super(SpikeArray, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
@@ -6505,6 +5922,7 @@ class SpikeArray(Standalone):
             obj_ = Spike.factory()
             obj_.build(child_)
             self.spike.append(obj_)
+            obj_.original_tagname_ = 'spike'
         super(SpikeArray, self).buildChildren(child_, node, nodeName_, True)
 # end class SpikeArray
 
@@ -6513,10 +5931,15 @@ class Spike(Standalone):
     subclass = None
     superclass = Standalone
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, time=None):
+        self.original_tagname_ = None
         super(Spike, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
         self.time = _cast(None, time)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Spike)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Spike.subclass:
             return Spike.subclass(*args_, **kwargs_)
         else:
@@ -6526,7 +5949,11 @@ class Spike(Standalone):
     def set_time(self, time): self.time = time
     def validate_Nml2Quantity_time(self, value):
         # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
     def hasContent_(self):
         if (
             super(Spike, self).hasContent_()
@@ -6539,13 +5966,15 @@ class Spike(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Spike')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Spike', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -6557,26 +5986,13 @@ class Spike(Standalone):
             outfile.write(' time=%s' % (quote_attrib(self.time), ))
     def exportChildren(self, outfile, level, namespace_='', name_='Spike', fromsubclass_=False, pretty_print=True):
         super(Spike, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='Spike'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.time is not None and 'time' not in already_processed:
-            already_processed.add('time')
-            showIndent(outfile, level)
-            outfile.write('time="%s",\n' % (self.time,))
-        super(Spike, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(Spike, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('time', node)
         if value is not None and 'time' not in already_processed:
@@ -6593,14 +6009,19 @@ class Spike(Standalone):
 class VoltageClamp(Standalone):
     subclass = None
     superclass = Standalone
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, delay=None, duration=None, seriesResistance=None, targetVoltage=None):
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, delay=None, duration=None, targetVoltage=None, seriesResistance=None):
+        self.original_tagname_ = None
         super(VoltageClamp, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
         self.delay = _cast(None, delay)
         self.duration = _cast(None, duration)
-        self.seriesResistance = _cast(None, seriesResistance)
         self.targetVoltage = _cast(None, targetVoltage)
-        pass
+        self.seriesResistance = _cast(None, seriesResistance)
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, VoltageClamp)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if VoltageClamp.subclass:
             return VoltageClamp.subclass(*args_, **kwargs_)
         else:
@@ -6608,21 +6029,33 @@ class VoltageClamp(Standalone):
     factory = staticmethod(factory)
     def get_delay(self): return self.delay
     def set_delay(self, delay): self.delay = delay
-    def validate_Nml2Quantity_time(self, value):
-        # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
     def get_duration(self): return self.duration
     def set_duration(self, duration): self.duration = duration
-    def get_seriesResistance(self): return self.seriesResistance
-    def set_seriesResistance(self, seriesResistance): self.seriesResistance = seriesResistance
-    def validate_Nml2Quantity_resistance(self, value):
-        # Validate type Nml2Quantity_resistance, a restriction on xs:string.
-        pass
     def get_targetVoltage(self): return self.targetVoltage
     def set_targetVoltage(self, targetVoltage): self.targetVoltage = targetVoltage
+    def get_seriesResistance(self): return self.seriesResistance
+    def set_seriesResistance(self, seriesResistance): self.seriesResistance = seriesResistance
+    def validate_Nml2Quantity_time(self, value):
+        # Validate type Nml2Quantity_time, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
     def validate_Nml2Quantity_voltage(self, value):
         # Validate type Nml2Quantity_voltage, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_voltage_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_voltage_patterns_, ))
+    validate_Nml2Quantity_voltage_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(V|mV)$']]
+    def validate_Nml2Quantity_resistance(self, value):
+        # Validate type Nml2Quantity_resistance, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_resistance_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_resistance_patterns_, ))
+    validate_Nml2Quantity_resistance_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(ohm|Kohm|Mohm)$']]
     def hasContent_(self):
         if (
             super(VoltageClamp, self).hasContent_()
@@ -6635,13 +6068,15 @@ class VoltageClamp(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='VoltageClamp')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='VoltageClamp', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -6654,46 +6089,21 @@ class VoltageClamp(Standalone):
         if self.duration is not None and 'duration' not in already_processed:
             already_processed.add('duration')
             outfile.write(' duration=%s' % (quote_attrib(self.duration), ))
-        if self.seriesResistance is not None and 'seriesResistance' not in already_processed:
-            already_processed.add('seriesResistance')
-            outfile.write(' seriesResistance=%s' % (quote_attrib(self.seriesResistance), ))
         if self.targetVoltage is not None and 'targetVoltage' not in already_processed:
             already_processed.add('targetVoltage')
             outfile.write(' targetVoltage=%s' % (quote_attrib(self.targetVoltage), ))
-    def exportChildren(self, outfile, level, namespace_='', name_='VoltageClamp', fromsubclass_=False, pretty_print=True):
-        super(VoltageClamp, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='VoltageClamp'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.delay is not None and 'delay' not in already_processed:
-            already_processed.add('delay')
-            showIndent(outfile, level)
-            outfile.write('delay="%s",\n' % (self.delay,))
-        if self.duration is not None and 'duration' not in already_processed:
-            already_processed.add('duration')
-            showIndent(outfile, level)
-            outfile.write('duration="%s",\n' % (self.duration,))
         if self.seriesResistance is not None and 'seriesResistance' not in already_processed:
             already_processed.add('seriesResistance')
-            showIndent(outfile, level)
-            outfile.write('seriesResistance="%s",\n' % (self.seriesResistance,))
-        if self.targetVoltage is not None and 'targetVoltage' not in already_processed:
-            already_processed.add('targetVoltage')
-            showIndent(outfile, level)
-            outfile.write('targetVoltage="%s",\n' % (self.targetVoltage,))
-        super(VoltageClamp, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(VoltageClamp, self).exportLiteralChildren(outfile, level, name_)
+            outfile.write(' seriesResistance=%s' % (quote_attrib(self.seriesResistance), ))
+    def exportChildren(self, outfile, level, namespace_='', name_='VoltageClamp', fromsubclass_=False, pretty_print=True):
+        super(VoltageClamp, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('delay', node)
         if value is not None and 'delay' not in already_processed:
@@ -6705,16 +6115,16 @@ class VoltageClamp(Standalone):
             already_processed.add('duration')
             self.duration = value
             self.validate_Nml2Quantity_time(self.duration)    # validate type Nml2Quantity_time
-        value = find_attr_value_('seriesResistance', node)
-        if value is not None and 'seriesResistance' not in already_processed:
-            already_processed.add('seriesResistance')
-            self.seriesResistance = value
-            self.validate_Nml2Quantity_resistance(self.seriesResistance)    # validate type Nml2Quantity_resistance
         value = find_attr_value_('targetVoltage', node)
         if value is not None and 'targetVoltage' not in already_processed:
             already_processed.add('targetVoltage')
             self.targetVoltage = value
             self.validate_Nml2Quantity_voltage(self.targetVoltage)    # validate type Nml2Quantity_voltage
+        value = find_attr_value_('seriesResistance', node)
+        if value is not None and 'seriesResistance' not in already_processed:
+            already_processed.add('seriesResistance')
+            self.seriesResistance = value
+            self.validate_Nml2Quantity_resistance(self.seriesResistance)    # validate type Nml2Quantity_resistance
         super(VoltageClamp, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         super(VoltageClamp, self).buildChildren(child_, node, nodeName_, True)
@@ -6725,15 +6135,20 @@ class VoltageClamp(Standalone):
 class RampGenerator(Standalone):
     subclass = None
     superclass = Standalone
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, delay=None, duration=None, baselineAmplitude=None, startAmplitude=None, finishAmplitude=None):
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, delay=None, duration=None, startAmplitude=None, finishAmplitude=None, baselineAmplitude=None):
+        self.original_tagname_ = None
         super(RampGenerator, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
         self.delay = _cast(None, delay)
         self.duration = _cast(None, duration)
-        self.baselineAmplitude = _cast(None, baselineAmplitude)
         self.startAmplitude = _cast(None, startAmplitude)
         self.finishAmplitude = _cast(None, finishAmplitude)
-        pass
+        self.baselineAmplitude = _cast(None, baselineAmplitude)
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, RampGenerator)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if RampGenerator.subclass:
             return RampGenerator.subclass(*args_, **kwargs_)
         else:
@@ -6741,20 +6156,28 @@ class RampGenerator(Standalone):
     factory = staticmethod(factory)
     def get_delay(self): return self.delay
     def set_delay(self, delay): self.delay = delay
-    def validate_Nml2Quantity_time(self, value):
-        # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
     def get_duration(self): return self.duration
     def set_duration(self, duration): self.duration = duration
-    def get_baselineAmplitude(self): return self.baselineAmplitude
-    def set_baselineAmplitude(self, baselineAmplitude): self.baselineAmplitude = baselineAmplitude
-    def validate_Nml2Quantity_current(self, value):
-        # Validate type Nml2Quantity_current, a restriction on xs:string.
-        pass
     def get_startAmplitude(self): return self.startAmplitude
     def set_startAmplitude(self, startAmplitude): self.startAmplitude = startAmplitude
     def get_finishAmplitude(self): return self.finishAmplitude
     def set_finishAmplitude(self, finishAmplitude): self.finishAmplitude = finishAmplitude
+    def get_baselineAmplitude(self): return self.baselineAmplitude
+    def set_baselineAmplitude(self, baselineAmplitude): self.baselineAmplitude = baselineAmplitude
+    def validate_Nml2Quantity_time(self, value):
+        # Validate type Nml2Quantity_time, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
+    def validate_Nml2Quantity_current(self, value):
+        # Validate type Nml2Quantity_current, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_current_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_current_patterns_, ))
+    validate_Nml2Quantity_current_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(A|uA|nA|pA)$']]
     def hasContent_(self):
         if (
             super(RampGenerator, self).hasContent_()
@@ -6767,13 +6190,15 @@ class RampGenerator(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='RampGenerator')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='RampGenerator', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -6786,53 +6211,24 @@ class RampGenerator(Standalone):
         if self.duration is not None and 'duration' not in already_processed:
             already_processed.add('duration')
             outfile.write(' duration=%s' % (quote_attrib(self.duration), ))
-        if self.baselineAmplitude is not None and 'baselineAmplitude' not in already_processed:
-            already_processed.add('baselineAmplitude')
-            outfile.write(' baselineAmplitude=%s' % (quote_attrib(self.baselineAmplitude), ))
         if self.startAmplitude is not None and 'startAmplitude' not in already_processed:
             already_processed.add('startAmplitude')
             outfile.write(' startAmplitude=%s' % (quote_attrib(self.startAmplitude), ))
         if self.finishAmplitude is not None and 'finishAmplitude' not in already_processed:
             already_processed.add('finishAmplitude')
             outfile.write(' finishAmplitude=%s' % (quote_attrib(self.finishAmplitude), ))
-    def exportChildren(self, outfile, level, namespace_='', name_='RampGenerator', fromsubclass_=False, pretty_print=True):
-        super(RampGenerator, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='RampGenerator'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.delay is not None and 'delay' not in already_processed:
-            already_processed.add('delay')
-            showIndent(outfile, level)
-            outfile.write('delay="%s",\n' % (self.delay,))
-        if self.duration is not None and 'duration' not in already_processed:
-            already_processed.add('duration')
-            showIndent(outfile, level)
-            outfile.write('duration="%s",\n' % (self.duration,))
         if self.baselineAmplitude is not None and 'baselineAmplitude' not in already_processed:
             already_processed.add('baselineAmplitude')
-            showIndent(outfile, level)
-            outfile.write('baselineAmplitude="%s",\n' % (self.baselineAmplitude,))
-        if self.startAmplitude is not None and 'startAmplitude' not in already_processed:
-            already_processed.add('startAmplitude')
-            showIndent(outfile, level)
-            outfile.write('startAmplitude="%s",\n' % (self.startAmplitude,))
-        if self.finishAmplitude is not None and 'finishAmplitude' not in already_processed:
-            already_processed.add('finishAmplitude')
-            showIndent(outfile, level)
-            outfile.write('finishAmplitude="%s",\n' % (self.finishAmplitude,))
-        super(RampGenerator, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(RampGenerator, self).exportLiteralChildren(outfile, level, name_)
+            outfile.write(' baselineAmplitude=%s' % (quote_attrib(self.baselineAmplitude), ))
+    def exportChildren(self, outfile, level, namespace_='', name_='RampGenerator', fromsubclass_=False, pretty_print=True):
+        super(RampGenerator, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('delay', node)
         if value is not None and 'delay' not in already_processed:
@@ -6844,11 +6240,6 @@ class RampGenerator(Standalone):
             already_processed.add('duration')
             self.duration = value
             self.validate_Nml2Quantity_time(self.duration)    # validate type Nml2Quantity_time
-        value = find_attr_value_('baselineAmplitude', node)
-        if value is not None and 'baselineAmplitude' not in already_processed:
-            already_processed.add('baselineAmplitude')
-            self.baselineAmplitude = value
-            self.validate_Nml2Quantity_current(self.baselineAmplitude)    # validate type Nml2Quantity_current
         value = find_attr_value_('startAmplitude', node)
         if value is not None and 'startAmplitude' not in already_processed:
             already_processed.add('startAmplitude')
@@ -6859,6 +6250,11 @@ class RampGenerator(Standalone):
             already_processed.add('finishAmplitude')
             self.finishAmplitude = value
             self.validate_Nml2Quantity_current(self.finishAmplitude)    # validate type Nml2Quantity_current
+        value = find_attr_value_('baselineAmplitude', node)
+        if value is not None and 'baselineAmplitude' not in already_processed:
+            already_processed.add('baselineAmplitude')
+            self.baselineAmplitude = value
+            self.validate_Nml2Quantity_current(self.baselineAmplitude)    # validate type Nml2Quantity_current
         super(RampGenerator, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         super(RampGenerator, self).buildChildren(child_, node, nodeName_, True)
@@ -6869,15 +6265,20 @@ class RampGenerator(Standalone):
 class SineGenerator(Standalone):
     subclass = None
     superclass = Standalone
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, delay=None, phase=None, duration=None, period=None, amplitude=None):
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, delay=None, phase=None, duration=None, amplitude=None, period=None):
+        self.original_tagname_ = None
         super(SineGenerator, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
         self.delay = _cast(None, delay)
         self.phase = _cast(None, phase)
         self.duration = _cast(None, duration)
-        self.period = _cast(None, period)
         self.amplitude = _cast(None, amplitude)
-        pass
+        self.period = _cast(None, period)
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, SineGenerator)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if SineGenerator.subclass:
             return SineGenerator.subclass(*args_, **kwargs_)
         else:
@@ -6885,23 +6286,35 @@ class SineGenerator(Standalone):
     factory = staticmethod(factory)
     def get_delay(self): return self.delay
     def set_delay(self, delay): self.delay = delay
-    def validate_Nml2Quantity_time(self, value):
-        # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
     def get_phase(self): return self.phase
     def set_phase(self, phase): self.phase = phase
-    def validate_Nml2Quantity_none(self, value):
-        # Validate type Nml2Quantity_none, a restriction on xs:string.
-        pass
     def get_duration(self): return self.duration
     def set_duration(self, duration): self.duration = duration
-    def get_period(self): return self.period
-    def set_period(self, period): self.period = period
     def get_amplitude(self): return self.amplitude
     def set_amplitude(self, amplitude): self.amplitude = amplitude
+    def get_period(self): return self.period
+    def set_period(self, period): self.period = period
+    def validate_Nml2Quantity_time(self, value):
+        # Validate type Nml2Quantity_time, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
+    def validate_Nml2Quantity_none(self, value):
+        # Validate type Nml2Quantity_none, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_none_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_none_patterns_, ))
+    validate_Nml2Quantity_none_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?$']]
     def validate_Nml2Quantity_current(self, value):
         # Validate type Nml2Quantity_current, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_current_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_current_patterns_, ))
+    validate_Nml2Quantity_current_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(A|uA|nA|pA)$']]
     def hasContent_(self):
         if (
             super(SineGenerator, self).hasContent_()
@@ -6914,13 +6327,15 @@ class SineGenerator(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='SineGenerator')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='SineGenerator', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -6936,50 +6351,21 @@ class SineGenerator(Standalone):
         if self.duration is not None and 'duration' not in already_processed:
             already_processed.add('duration')
             outfile.write(' duration=%s' % (quote_attrib(self.duration), ))
-        if self.period is not None and 'period' not in already_processed:
-            already_processed.add('period')
-            outfile.write(' period=%s' % (quote_attrib(self.period), ))
         if self.amplitude is not None and 'amplitude' not in already_processed:
             already_processed.add('amplitude')
             outfile.write(' amplitude=%s' % (quote_attrib(self.amplitude), ))
+        if self.period is not None and 'period' not in already_processed:
+            already_processed.add('period')
+            outfile.write(' period=%s' % (quote_attrib(self.period), ))
     def exportChildren(self, outfile, level, namespace_='', name_='SineGenerator', fromsubclass_=False, pretty_print=True):
         super(SineGenerator, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='SineGenerator'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.delay is not None and 'delay' not in already_processed:
-            already_processed.add('delay')
-            showIndent(outfile, level)
-            outfile.write('delay="%s",\n' % (self.delay,))
-        if self.phase is not None and 'phase' not in already_processed:
-            already_processed.add('phase')
-            showIndent(outfile, level)
-            outfile.write('phase="%s",\n' % (self.phase,))
-        if self.duration is not None and 'duration' not in already_processed:
-            already_processed.add('duration')
-            showIndent(outfile, level)
-            outfile.write('duration="%s",\n' % (self.duration,))
-        if self.period is not None and 'period' not in already_processed:
-            already_processed.add('period')
-            showIndent(outfile, level)
-            outfile.write('period="%s",\n' % (self.period,))
-        if self.amplitude is not None and 'amplitude' not in already_processed:
-            already_processed.add('amplitude')
-            showIndent(outfile, level)
-            outfile.write('amplitude="%s",\n' % (self.amplitude,))
-        super(SineGenerator, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(SineGenerator, self).exportLiteralChildren(outfile, level, name_)
-    def build(self, node):
+    def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('delay', node)
         if value is not None and 'delay' not in already_processed:
@@ -6996,16 +6382,16 @@ class SineGenerator(Standalone):
             already_processed.add('duration')
             self.duration = value
             self.validate_Nml2Quantity_time(self.duration)    # validate type Nml2Quantity_time
-        value = find_attr_value_('period', node)
-        if value is not None and 'period' not in already_processed:
-            already_processed.add('period')
-            self.period = value
-            self.validate_Nml2Quantity_time(self.period)    # validate type Nml2Quantity_time
         value = find_attr_value_('amplitude', node)
         if value is not None and 'amplitude' not in already_processed:
             already_processed.add('amplitude')
             self.amplitude = value
             self.validate_Nml2Quantity_current(self.amplitude)    # validate type Nml2Quantity_current
+        value = find_attr_value_('period', node)
+        if value is not None and 'period' not in already_processed:
+            already_processed.add('period')
+            self.period = value
+            self.validate_Nml2Quantity_time(self.period)    # validate type Nml2Quantity_time
         super(SineGenerator, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         super(SineGenerator, self).buildChildren(child_, node, nodeName_, True)
@@ -7017,12 +6403,17 @@ class PulseGenerator(Standalone):
     subclass = None
     superclass = Standalone
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, delay=None, duration=None, amplitude=None):
+        self.original_tagname_ = None
         super(PulseGenerator, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
         self.delay = _cast(None, delay)
         self.duration = _cast(None, duration)
         self.amplitude = _cast(None, amplitude)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, PulseGenerator)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if PulseGenerator.subclass:
             return PulseGenerator.subclass(*args_, **kwargs_)
         else:
@@ -7030,16 +6421,24 @@ class PulseGenerator(Standalone):
     factory = staticmethod(factory)
     def get_delay(self): return self.delay
     def set_delay(self, delay): self.delay = delay
-    def validate_Nml2Quantity_time(self, value):
-        # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
     def get_duration(self): return self.duration
     def set_duration(self, duration): self.duration = duration
     def get_amplitude(self): return self.amplitude
     def set_amplitude(self, amplitude): self.amplitude = amplitude
+    def validate_Nml2Quantity_time(self, value):
+        # Validate type Nml2Quantity_time, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
     def validate_Nml2Quantity_current(self, value):
         # Validate type Nml2Quantity_current, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_current_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_current_patterns_, ))
+    validate_Nml2Quantity_current_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(A|uA|nA|pA)$']]
     def hasContent_(self):
         if (
             super(PulseGenerator, self).hasContent_()
@@ -7052,13 +6451,15 @@ class PulseGenerator(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='PulseGenerator')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='PulseGenerator', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -7076,34 +6477,13 @@ class PulseGenerator(Standalone):
             outfile.write(' amplitude=%s' % (quote_attrib(self.amplitude), ))
     def exportChildren(self, outfile, level, namespace_='', name_='PulseGenerator', fromsubclass_=False, pretty_print=True):
         super(PulseGenerator, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='PulseGenerator'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.delay is not None and 'delay' not in already_processed:
-            already_processed.add('delay')
-            showIndent(outfile, level)
-            outfile.write('delay="%s",\n' % (self.delay,))
-        if self.duration is not None and 'duration' not in already_processed:
-            already_processed.add('duration')
-            showIndent(outfile, level)
-            outfile.write('duration="%s",\n' % (self.duration,))
-        if self.amplitude is not None and 'amplitude' not in already_processed:
-            already_processed.add('amplitude')
-            showIndent(outfile, level)
-            outfile.write('amplitude="%s",\n' % (self.amplitude,))
-        super(PulseGenerator, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(PulseGenerator, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('delay', node)
         if value is not None and 'delay' not in already_processed:
@@ -7131,6 +6511,7 @@ class ReactionScheme(Base):
     subclass = None
     superclass = Base
     def __init__(self, id=None, neuroLexId=None, source=None, type_=None, anytypeobjs_=None):
+        self.original_tagname_ = None
         super(ReactionScheme, self).__init__(id, neuroLexId, )
         self.source = _cast(None, source)
         self.type_ = _cast(None, type_)
@@ -7139,6 +6520,11 @@ class ReactionScheme(Base):
         else:
             self.anytypeobjs_ = anytypeobjs_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, ReactionScheme)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if ReactionScheme.subclass:
             return ReactionScheme.subclass(*args_, **kwargs_)
         else:
@@ -7165,13 +6551,15 @@ class ReactionScheme(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='ReactionScheme')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='ReactionScheme', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -7180,10 +6568,10 @@ class ReactionScheme(Base):
         super(ReactionScheme, self).exportAttributes(outfile, level, already_processed, namespace_, name_='ReactionScheme')
         if self.source is not None and 'source' not in already_processed:
             already_processed.add('source')
-            outfile.write(' source=%s' % (self.gds_format_string(quote_attrib(self.source).encode(ExternalEncoding), input_name='source'), ))
+            outfile.write(' source=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.source), input_name='source')), ))
         if self.type_ is not None and 'type_' not in already_processed:
             already_processed.add('type_')
-            outfile.write(' type=%s' % (self.gds_format_string(quote_attrib(self.type_).encode(ExternalEncoding), input_name='type'), ))
+            outfile.write(' type=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.type_), input_name='type')), ))
     def exportChildren(self, outfile, level, namespace_='', name_='ReactionScheme', fromsubclass_=False, pretty_print=True):
         super(ReactionScheme, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
         if pretty_print:
@@ -7192,38 +6580,13 @@ class ReactionScheme(Base):
             eol_ = ''
         for obj_ in self.anytypeobjs_:
             obj_.export(outfile, level, namespace_, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='ReactionScheme'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.source is not None and 'source' not in already_processed:
-            already_processed.add('source')
-            showIndent(outfile, level)
-            outfile.write('source="%s",\n' % (self.source,))
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            showIndent(outfile, level)
-            outfile.write('type_="%s",\n' % (self.type_,))
-        super(ReactionScheme, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(ReactionScheme, self).exportLiteralChildren(outfile, level, name_)
-        showIndent(outfile, level)
-        outfile.write('anytypeobjs_=[\n')
-        level += 1
-        for anytypeobjs_ in self.anytypeobjs_:
-            anytypeobjs_.exportLiteral(outfile, level)
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('source', node)
         if value is not None and 'source' not in already_processed:
@@ -7246,6 +6609,7 @@ class ExtracellularProperties(Base):
     subclass = None
     superclass = Base
     def __init__(self, id=None, neuroLexId=None, temperature=None, species=None):
+        self.original_tagname_ = None
         super(ExtracellularProperties, self).__init__(id, neuroLexId, )
         self.temperature = _cast(None, temperature)
         if species is None:
@@ -7253,6 +6617,11 @@ class ExtracellularProperties(Base):
         else:
             self.species = species
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, ExtracellularProperties)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if ExtracellularProperties.subclass:
             return ExtracellularProperties.subclass(*args_, **kwargs_)
         else:
@@ -7261,12 +6630,17 @@ class ExtracellularProperties(Base):
     def get_species(self): return self.species
     def set_species(self, species): self.species = species
     def add_species(self, value): self.species.append(value)
-    def insert_species(self, index, value): self.species[index] = value
+    def insert_species_at(self, index, value): self.species.insert(index, value)
+    def replace_species_at(self, index, value): self.species[index] = value
     def get_temperature(self): return self.temperature
     def set_temperature(self, temperature): self.temperature = temperature
     def validate_Nml2Quantity_temperature(self, value):
         # Validate type Nml2Quantity_temperature, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_temperature_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_temperature_patterns_, ))
+    validate_Nml2Quantity_temperature_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(degC)$']]
     def hasContent_(self):
         if (
             self.species or
@@ -7280,13 +6654,15 @@ class ExtracellularProperties(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='ExtracellularProperties')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='ExtracellularProperties', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -7304,38 +6680,13 @@ class ExtracellularProperties(Base):
             eol_ = ''
         for species_ in self.species:
             species_.export(outfile, level, namespace_, name_='species', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='ExtracellularProperties'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.temperature is not None and 'temperature' not in already_processed:
-            already_processed.add('temperature')
-            showIndent(outfile, level)
-            outfile.write('temperature="%s",\n' % (self.temperature,))
-        super(ExtracellularProperties, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(ExtracellularProperties, self).exportLiteralChildren(outfile, level, name_)
-        showIndent(outfile, level)
-        outfile.write('species=[\n')
-        level += 1
-        for species_ in self.species:
-            showIndent(outfile, level)
-            outfile.write('model_.Species(\n')
-            species_.exportLiteral(outfile, level, name_='Species')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('temperature', node)
         if value is not None and 'temperature' not in already_processed:
@@ -7348,6 +6699,7 @@ class ExtracellularProperties(Base):
             obj_ = Species.factory()
             obj_.build(child_)
             self.species.append(obj_)
+            obj_.original_tagname_ = 'species'
         super(ExtracellularProperties, self).buildChildren(child_, node, nodeName_, True)
 # end class ExtracellularProperties
 
@@ -7361,19 +6713,25 @@ class ChannelDensity(Base):
     ionChannel element. TODO: remove."""
     subclass = None
     superclass = Base
-    def __init__(self, id=None, neuroLexId=None, segmentGroup='all', ion=None, ionChannel=None, erev=None, condDensity=None, segment=None, variableParameter=None):
+    def __init__(self, id=None, neuroLexId=None, ionChannel=None, condDensity=None, erev=None, segmentGroup='all', segment=None, ion=None, variableParameter=None):
+        self.original_tagname_ = None
         super(ChannelDensity, self).__init__(id, neuroLexId, )
-        self.segmentGroup = _cast(None, segmentGroup)
-        self.ion = _cast(None, ion)
         self.ionChannel = _cast(None, ionChannel)
-        self.erev = _cast(None, erev)
         self.condDensity = _cast(None, condDensity)
+        self.erev = _cast(None, erev)
+        self.segmentGroup = _cast(None, segmentGroup)
         self.segment = _cast(None, segment)
+        self.ion = _cast(None, ion)
         if variableParameter is None:
             self.variableParameter = []
         else:
             self.variableParameter = variableParameter
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, ChannelDensity)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if ChannelDensity.subclass:
             return ChannelDensity.subclass(*args_, **kwargs_)
         else:
@@ -7382,28 +6740,41 @@ class ChannelDensity(Base):
     def get_variableParameter(self): return self.variableParameter
     def set_variableParameter(self, variableParameter): self.variableParameter = variableParameter
     def add_variableParameter(self, value): self.variableParameter.append(value)
-    def insert_variableParameter(self, index, value): self.variableParameter[index] = value
-    def get_segmentGroup(self): return self.segmentGroup
-    def set_segmentGroup(self, segmentGroup): self.segmentGroup = segmentGroup
-    def validate_NmlId(self, value):
-        # Validate type NmlId, a restriction on xs:string.
-        pass
-    def get_ion(self): return self.ion
-    def set_ion(self, ion): self.ion = ion
+    def insert_variableParameter_at(self, index, value): self.variableParameter.insert(index, value)
+    def replace_variableParameter_at(self, index, value): self.variableParameter[index] = value
     def get_ionChannel(self): return self.ionChannel
     def set_ionChannel(self, ionChannel): self.ionChannel = ionChannel
-    def get_erev(self): return self.erev
-    def set_erev(self, erev): self.erev = erev
-    def validate_Nml2Quantity_voltage(self, value):
-        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
-        pass
     def get_condDensity(self): return self.condDensity
     def set_condDensity(self, condDensity): self.condDensity = condDensity
-    def validate_Nml2Quantity_conductanceDensity(self, value):
-        # Validate type Nml2Quantity_conductanceDensity, a restriction on xs:string.
-        pass
+    def get_erev(self): return self.erev
+    def set_erev(self, erev): self.erev = erev
+    def get_segmentGroup(self): return self.segmentGroup
+    def set_segmentGroup(self, segmentGroup): self.segmentGroup = segmentGroup
     def get_segment(self): return self.segment
     def set_segment(self, segment): self.segment = segment
+    def get_ion(self): return self.ion
+    def set_ion(self, ion): self.ion = ion
+    def validate_NmlId(self, value):
+        # Validate type NmlId, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
+    def validate_Nml2Quantity_conductanceDensity(self, value):
+        # Validate type Nml2Quantity_conductanceDensity, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_conductanceDensity_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_conductanceDensity_patterns_, ))
+    validate_Nml2Quantity_conductanceDensity_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(S_per_m2|mS_per_cm2|S_per_cm2)$']]
+    def validate_Nml2Quantity_voltage(self, value):
+        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_voltage_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_voltage_patterns_, ))
+    validate_Nml2Quantity_voltage_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(V|mV)$']]
     def hasContent_(self):
         if (
             self.variableParameter or
@@ -7417,37 +6788,39 @@ class ChannelDensity(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='ChannelDensity')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='ChannelDensity', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='ChannelDensity'):
         super(ChannelDensity, self).exportAttributes(outfile, level, already_processed, namespace_, name_='ChannelDensity')
-        if self.segmentGroup is not None and 'segmentGroup' not in already_processed:
-            already_processed.add('segmentGroup')
-            outfile.write(' segmentGroup=%s' % (quote_attrib(self.segmentGroup), ))
-        if self.ion is not None and 'ion' not in already_processed:
-            already_processed.add('ion')
-            outfile.write(' ion=%s' % (quote_attrib(self.ion), ))
         if self.ionChannel is not None and 'ionChannel' not in already_processed:
             already_processed.add('ionChannel')
             outfile.write(' ionChannel=%s' % (quote_attrib(self.ionChannel), ))
-        if self.erev is not None and 'erev' not in already_processed:
-            already_processed.add('erev')
-            outfile.write(' erev=%s' % (quote_attrib(self.erev), ))
         if self.condDensity is not None and 'condDensity' not in already_processed:
             already_processed.add('condDensity')
             outfile.write(' condDensity=%s' % (quote_attrib(self.condDensity), ))
+        if self.erev is not None and 'erev' not in already_processed:
+            already_processed.add('erev')
+            outfile.write(' erev=%s' % (quote_attrib(self.erev), ))
+        if self.segmentGroup != "all" and 'segmentGroup' not in already_processed:
+            already_processed.add('segmentGroup')
+            outfile.write(' segmentGroup=%s' % (quote_attrib(self.segmentGroup), ))
         if self.segment is not None and 'segment' not in already_processed:
             already_processed.add('segment')
             outfile.write(' segment=%s' % (quote_attrib(self.segment), ))
+        if self.ion is not None and 'ion' not in already_processed:
+            already_processed.add('ion')
+            outfile.write(' ion=%s' % (quote_attrib(self.ion), ))
     def exportChildren(self, outfile, level, namespace_='', name_='ChannelDensity', fromsubclass_=False, pretty_print=True):
         super(ChannelDensity, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
         if pretty_print:
@@ -7456,95 +6829,51 @@ class ChannelDensity(Base):
             eol_ = ''
         for variableParameter_ in self.variableParameter:
             variableParameter_.export(outfile, level, namespace_, name_='variableParameter', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='ChannelDensity'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.segmentGroup is not None and 'segmentGroup' not in already_processed:
-            already_processed.add('segmentGroup')
-            showIndent(outfile, level)
-            outfile.write('segmentGroup="%s",\n' % (self.segmentGroup,))
-        if self.ion is not None and 'ion' not in already_processed:
-            already_processed.add('ion')
-            showIndent(outfile, level)
-            outfile.write('ion="%s",\n' % (self.ion,))
-        if self.ionChannel is not None and 'ionChannel' not in already_processed:
-            already_processed.add('ionChannel')
-            showIndent(outfile, level)
-            outfile.write('ionChannel="%s",\n' % (self.ionChannel,))
-        if self.erev is not None and 'erev' not in already_processed:
-            already_processed.add('erev')
-            showIndent(outfile, level)
-            outfile.write('erev="%s",\n' % (self.erev,))
-        if self.condDensity is not None and 'condDensity' not in already_processed:
-            already_processed.add('condDensity')
-            showIndent(outfile, level)
-            outfile.write('condDensity="%s",\n' % (self.condDensity,))
-        if self.segment is not None and 'segment' not in already_processed:
-            already_processed.add('segment')
-            showIndent(outfile, level)
-            outfile.write('segment="%s",\n' % (self.segment,))
-        super(ChannelDensity, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(ChannelDensity, self).exportLiteralChildren(outfile, level, name_)
-        showIndent(outfile, level)
-        outfile.write('variableParameter=[\n')
-        level += 1
-        for variableParameter_ in self.variableParameter:
-            showIndent(outfile, level)
-            outfile.write('model_.VariableParameter(\n')
-            variableParameter_.exportLiteral(outfile, level, name_='VariableParameter')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('segmentGroup', node)
-        if value is not None and 'segmentGroup' not in already_processed:
-            already_processed.add('segmentGroup')
-            self.segmentGroup = value
-            self.validate_NmlId(self.segmentGroup)    # validate type NmlId
-        value = find_attr_value_('ion', node)
-        if value is not None and 'ion' not in already_processed:
-            already_processed.add('ion')
-            self.ion = value
-            self.validate_NmlId(self.ion)    # validate type NmlId
         value = find_attr_value_('ionChannel', node)
         if value is not None and 'ionChannel' not in already_processed:
             already_processed.add('ionChannel')
             self.ionChannel = value
             self.validate_NmlId(self.ionChannel)    # validate type NmlId
-        value = find_attr_value_('erev', node)
-        if value is not None and 'erev' not in already_processed:
-            already_processed.add('erev')
-            self.erev = value
-            self.validate_Nml2Quantity_voltage(self.erev)    # validate type Nml2Quantity_voltage
         value = find_attr_value_('condDensity', node)
         if value is not None and 'condDensity' not in already_processed:
             already_processed.add('condDensity')
             self.condDensity = value
             self.validate_Nml2Quantity_conductanceDensity(self.condDensity)    # validate type Nml2Quantity_conductanceDensity
+        value = find_attr_value_('erev', node)
+        if value is not None and 'erev' not in already_processed:
+            already_processed.add('erev')
+            self.erev = value
+            self.validate_Nml2Quantity_voltage(self.erev)    # validate type Nml2Quantity_voltage
+        value = find_attr_value_('segmentGroup', node)
+        if value is not None and 'segmentGroup' not in already_processed:
+            already_processed.add('segmentGroup')
+            self.segmentGroup = value
+            self.validate_NmlId(self.segmentGroup)    # validate type NmlId
         value = find_attr_value_('segment', node)
         if value is not None and 'segment' not in already_processed:
             already_processed.add('segment')
             self.segment = value
             self.validate_NmlId(self.segment)    # validate type NmlId
+        value = find_attr_value_('ion', node)
+        if value is not None and 'ion' not in already_processed:
+            already_processed.add('ion')
+            self.ion = value
+            self.validate_NmlId(self.ion)    # validate type NmlId
         super(ChannelDensity, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         if nodeName_ == 'variableParameter':
             obj_ = VariableParameter.factory()
             obj_.build(child_)
             self.variableParameter.append(obj_)
+            obj_.original_tagname_ = 'variableParameter'
         super(ChannelDensity, self).buildChildren(child_, node, nodeName_, True)
 # end class ChannelDensity
 
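A note on the validation pattern this patch introduces for `Nml2Quantity_voltage` (see `validate_Nml2Quantity_voltage_patterns_` above): it admits an optional sign, a decimal number, an optional exponent, optional whitespace, and a `V` or `mV` unit. A minimal sketch of what the pattern accepts, using Python's `re` directly rather than the generated `gds_validate_simple_patterns` wrapper:

```python
import re

# Pattern copied verbatim from validate_Nml2Quantity_voltage_patterns_ in the diff.
VOLTAGE = re.compile(r'^-?([0-9]*(\.[0-9]+)?)([eE]-?[0-9]+)?[\s]*(V|mV)$')

def is_voltage(value):
    """Return True when `value` is a well-formed Nml2Quantity_voltage string."""
    return VOLTAGE.match(value) is not None

# Accepted: optional minus, decimal, optional exponent, whitespace, V or mV unit.
assert is_voltage('-65mV')
assert is_voltage('0.07 V')
assert is_voltage('-6.5e1 mV')
# Rejected: unsupported unit, and a bare number with no unit.
assert not is_voltage('10 uV')
assert not is_voltage('10')
```

The generated code only warns (via `warnings_.warn`) on mismatch; it does not raise, so invalid strings still load.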
@@ -7558,19 +6887,25 @@ class ChannelPopulation(Base):
     ionChannel element. TODO: remove."""
     subclass = None
     superclass = Base
-    def __init__(self, id=None, neuroLexId=None, segmentGroup='all', ion=None, number=None, ionChannel=None, erev=None, segment=None, variableParameter=None):
+    def __init__(self, id=None, neuroLexId=None, ionChannel=None, number=None, erev=None, segmentGroup='all', segment=None, ion=None, variableParameter=None):
+        self.original_tagname_ = None
         super(ChannelPopulation, self).__init__(id, neuroLexId, )
-        self.segmentGroup = _cast(None, segmentGroup)
-        self.ion = _cast(None, ion)
-        self.number = _cast(int, number)
         self.ionChannel = _cast(None, ionChannel)
+        self.number = _cast(int, number)
         self.erev = _cast(None, erev)
+        self.segmentGroup = _cast(None, segmentGroup)
         self.segment = _cast(None, segment)
+        self.ion = _cast(None, ion)
         if variableParameter is None:
             self.variableParameter = []
         else:
             self.variableParameter = variableParameter
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, ChannelPopulation)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if ChannelPopulation.subclass:
             return ChannelPopulation.subclass(*args_, **kwargs_)
         else:
@@ -7579,25 +6914,34 @@ class ChannelPopulation(Base):
     def get_variableParameter(self): return self.variableParameter
     def set_variableParameter(self, variableParameter): self.variableParameter = variableParameter
     def add_variableParameter(self, value): self.variableParameter.append(value)
-    def insert_variableParameter(self, index, value): self.variableParameter[index] = value
-    def get_segmentGroup(self): return self.segmentGroup
-    def set_segmentGroup(self, segmentGroup): self.segmentGroup = segmentGroup
-    def validate_NmlId(self, value):
-        # Validate type NmlId, a restriction on xs:string.
-        pass
-    def get_ion(self): return self.ion
-    def set_ion(self, ion): self.ion = ion
-    def get_number(self): return self.number
-    def set_number(self, number): self.number = number
+    def insert_variableParameter_at(self, index, value): self.variableParameter.insert(index, value)
+    def replace_variableParameter_at(self, index, value): self.variableParameter[index] = value
     def get_ionChannel(self): return self.ionChannel
     def set_ionChannel(self, ionChannel): self.ionChannel = ionChannel
+    def get_number(self): return self.number
+    def set_number(self, number): self.number = number
     def get_erev(self): return self.erev
     def set_erev(self, erev): self.erev = erev
-    def validate_Nml2Quantity_voltage(self, value):
-        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
-        pass
+    def get_segmentGroup(self): return self.segmentGroup
+    def set_segmentGroup(self, segmentGroup): self.segmentGroup = segmentGroup
     def get_segment(self): return self.segment
     def set_segment(self, segment): self.segment = segment
+    def get_ion(self): return self.ion
+    def set_ion(self, ion): self.ion = ion
+    def validate_NmlId(self, value):
+        # Validate type NmlId, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
+    def validate_Nml2Quantity_voltage(self, value):
+        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_voltage_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_voltage_patterns_, ))
+    validate_Nml2Quantity_voltage_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(V|mV)$']]
     def hasContent_(self):
         if (
             self.variableParameter or
@@ -7611,37 +6955,39 @@ class ChannelPopulation(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='ChannelPopulation')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='ChannelPopulation', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='ChannelPopulation'):
         super(ChannelPopulation, self).exportAttributes(outfile, level, already_processed, namespace_, name_='ChannelPopulation')
-        if self.segmentGroup is not None and 'segmentGroup' not in already_processed:
-            already_processed.add('segmentGroup')
-            outfile.write(' segmentGroup=%s' % (quote_attrib(self.segmentGroup), ))
-        if self.ion is not None and 'ion' not in already_processed:
-            already_processed.add('ion')
-            outfile.write(' ion=%s' % (quote_attrib(self.ion), ))
-        if self.number is not None and 'number' not in already_processed:
-            already_processed.add('number')
-            outfile.write(' number="%s"' % self.gds_format_integer(self.number, input_name='number'))
         if self.ionChannel is not None and 'ionChannel' not in already_processed:
             already_processed.add('ionChannel')
             outfile.write(' ionChannel=%s' % (quote_attrib(self.ionChannel), ))
+        if self.number is not None and 'number' not in already_processed:
+            already_processed.add('number')
+            outfile.write(' number="%s"' % self.gds_format_integer(self.number, input_name='number'))
         if self.erev is not None and 'erev' not in already_processed:
             already_processed.add('erev')
             outfile.write(' erev=%s' % (quote_attrib(self.erev), ))
+        if self.segmentGroup != "all" and 'segmentGroup' not in already_processed:
+            already_processed.add('segmentGroup')
+            outfile.write(' segmentGroup=%s' % (quote_attrib(self.segmentGroup), ))
         if self.segment is not None and 'segment' not in already_processed:
             already_processed.add('segment')
             outfile.write(' segment=%s' % (quote_attrib(self.segment), ))
+        if self.ion is not None and 'ion' not in already_processed:
+            already_processed.add('ion')
+            outfile.write(' ion=%s' % (quote_attrib(self.ion), ))
     def exportChildren(self, outfile, level, namespace_='', name_='ChannelPopulation', fromsubclass_=False, pretty_print=True):
         super(ChannelPopulation, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
         if pretty_print:
@@ -7650,69 +6996,19 @@ class ChannelPopulation(Base):
             eol_ = ''
         for variableParameter_ in self.variableParameter:
             variableParameter_.export(outfile, level, namespace_, name_='variableParameter', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='ChannelPopulation'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.segmentGroup is not None and 'segmentGroup' not in already_processed:
-            already_processed.add('segmentGroup')
-            showIndent(outfile, level)
-            outfile.write('segmentGroup="%s",\n' % (self.segmentGroup,))
-        if self.ion is not None and 'ion' not in already_processed:
-            already_processed.add('ion')
-            showIndent(outfile, level)
-            outfile.write('ion="%s",\n' % (self.ion,))
-        if self.number is not None and 'number' not in already_processed:
-            already_processed.add('number')
-            showIndent(outfile, level)
-            outfile.write('number=%d,\n' % (self.number,))
-        if self.ionChannel is not None and 'ionChannel' not in already_processed:
-            already_processed.add('ionChannel')
-            showIndent(outfile, level)
-            outfile.write('ionChannel="%s",\n' % (self.ionChannel,))
-        if self.erev is not None and 'erev' not in already_processed:
-            already_processed.add('erev')
-            showIndent(outfile, level)
-            outfile.write('erev="%s",\n' % (self.erev,))
-        if self.segment is not None and 'segment' not in already_processed:
-            already_processed.add('segment')
-            showIndent(outfile, level)
-            outfile.write('segment="%s",\n' % (self.segment,))
-        super(ChannelPopulation, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(ChannelPopulation, self).exportLiteralChildren(outfile, level, name_)
-        showIndent(outfile, level)
-        outfile.write('variableParameter=[\n')
-        level += 1
-        for variableParameter_ in self.variableParameter:
-            showIndent(outfile, level)
-            outfile.write('model_.VariableParameter(\n')
-            variableParameter_.exportLiteral(outfile, level, name_='VariableParameter')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('segmentGroup', node)
-        if value is not None and 'segmentGroup' not in already_processed:
-            already_processed.add('segmentGroup')
-            self.segmentGroup = value
-            self.validate_NmlId(self.segmentGroup)    # validate type NmlId
-        value = find_attr_value_('ion', node)
-        if value is not None and 'ion' not in already_processed:
-            already_processed.add('ion')
-            self.ion = value
-            self.validate_NmlId(self.ion)    # validate type NmlId
+        value = find_attr_value_('ionChannel', node)
+        if value is not None and 'ionChannel' not in already_processed:
+            already_processed.add('ionChannel')
+            self.ionChannel = value
+            self.validate_NmlId(self.ionChannel)    # validate type NmlId
         value = find_attr_value_('number', node)
         if value is not None and 'number' not in already_processed:
             already_processed.add('number')
@@ -7722,27 +7018,33 @@ class ChannelPopulation(Base):
                 raise_parse_error(node, 'Bad integer attribute: %s' % exp)
             if self.number < 0:
                 raise_parse_error(node, 'Invalid NonNegativeInteger')
-        value = find_attr_value_('ionChannel', node)
-        if value is not None and 'ionChannel' not in already_processed:
-            already_processed.add('ionChannel')
-            self.ionChannel = value
-            self.validate_NmlId(self.ionChannel)    # validate type NmlId
         value = find_attr_value_('erev', node)
         if value is not None and 'erev' not in already_processed:
             already_processed.add('erev')
             self.erev = value
             self.validate_Nml2Quantity_voltage(self.erev)    # validate type Nml2Quantity_voltage
-        value = find_attr_value_('segment', node)
-        if value is not None and 'segment' not in already_processed:
-            already_processed.add('segment')
-            self.segment = value
+        value = find_attr_value_('segmentGroup', node)
+        if value is not None and 'segmentGroup' not in already_processed:
+            already_processed.add('segmentGroup')
+            self.segmentGroup = value
+            self.validate_NmlId(self.segmentGroup)    # validate type NmlId
+        value = find_attr_value_('segment', node)
+        if value is not None and 'segment' not in already_processed:
+            already_processed.add('segment')
+            self.segment = value
             self.validate_NmlId(self.segment)    # validate type NmlId
+        value = find_attr_value_('ion', node)
+        if value is not None and 'ion' not in already_processed:
+            already_processed.add('ion')
+            self.ion = value
+            self.validate_NmlId(self.ion)    # validate type NmlId
         super(ChannelPopulation, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         if nodeName_ == 'variableParameter':
             obj_ = VariableParameter.factory()
             obj_.build(child_)
             self.variableParameter.append(obj_)
+            obj_.original_tagname_ = 'variableParameter'
         super(ChannelPopulation, self).buildChildren(child_, node, nodeName_, True)
 # end class ChannelPopulation
 
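A semantic change worth noting in both `exportAttributes()` bodies above: the guard for `segmentGroup` moved from `is not None` to `!= "all"`, so the schema default `segmentGroup="all"` is no longer serialized. A minimal sketch of the effect (the `write_attrs` helper is hypothetical; `quote_attrib` from the generated code is replaced by plain quoting):

```python
def write_attrs(segmentGroup='all', ion=None):
    """Sketch: serialize attributes the way the patched exportAttributes() does.

    The schema default segmentGroup="all" is suppressed; other attributes
    are written whenever they are set (not None).
    """
    parts = []
    if segmentGroup != 'all':                 # pre-patch guard was `is not None`
        parts.append('segmentGroup="%s"' % segmentGroup)
    if ion is not None:
        parts.append('ion="%s"' % ion)
    return ' '.join(parts)

assert write_attrs() == ''                               # default stays implicit
assert write_attrs(segmentGroup='soma') == 'segmentGroup="soma"'
assert write_attrs(ion='na') == 'ion="na"'
```

This keeps the exported XML minimal, at the cost of never emitting an explicit `segmentGroup="all"` even when the user set it deliberately.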
@@ -7753,11 +7055,17 @@ class BiophysicalProperties(Standalone):
     subclass = None
     superclass = Standalone
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, membraneProperties=None, intracellularProperties=None, extracellularProperties=None):
+        self.original_tagname_ = None
         super(BiophysicalProperties, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
         self.membraneProperties = membraneProperties
         self.intracellularProperties = intracellularProperties
         self.extracellularProperties = extracellularProperties
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, BiophysicalProperties)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if BiophysicalProperties.subclass:
             return BiophysicalProperties.subclass(*args_, **kwargs_)
         else:
@@ -7784,13 +7092,15 @@ class BiophysicalProperties(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='BiophysicalProperties')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='BiophysicalProperties', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -7809,55 +7119,31 @@ class BiophysicalProperties(Standalone):
             self.intracellularProperties.export(outfile, level, namespace_, name_='intracellularProperties', pretty_print=pretty_print)
         if self.extracellularProperties is not None:
             self.extracellularProperties.export(outfile, level, namespace_, name_='extracellularProperties', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='BiophysicalProperties'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        super(BiophysicalProperties, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(BiophysicalProperties, self).exportLiteralChildren(outfile, level, name_)
-        if self.membraneProperties is not None:
-            showIndent(outfile, level)
-            outfile.write('membraneProperties=model_.MembraneProperties(\n')
-            self.membraneProperties.exportLiteral(outfile, level, name_='membraneProperties')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.intracellularProperties is not None:
-            showIndent(outfile, level)
-            outfile.write('intracellularProperties=model_.IntracellularProperties(\n')
-            self.intracellularProperties.exportLiteral(outfile, level, name_='intracellularProperties')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.extracellularProperties is not None:
-            showIndent(outfile, level)
-            outfile.write('extracellularProperties=model_.ExtracellularProperties(\n')
-            self.extracellularProperties.exportLiteral(outfile, level, name_='extracellularProperties')
-            showIndent(outfile, level)
-            outfile.write('),\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         super(BiophysicalProperties, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         if nodeName_ == 'membraneProperties':
             obj_ = MembraneProperties.factory()
             obj_.build(child_)
-            self.set_membraneProperties(obj_)
+            self.membraneProperties = obj_
+            obj_.original_tagname_ = 'membraneProperties'
         elif nodeName_ == 'intracellularProperties':
             obj_ = IntracellularProperties.factory()
             obj_.build(child_)
-            self.set_intracellularProperties(obj_)
+            self.intracellularProperties = obj_
+            obj_.original_tagname_ = 'intracellularProperties'
         elif nodeName_ == 'extracellularProperties':
             obj_ = ExtracellularProperties.factory()
             obj_.build(child_)
-            self.set_extracellularProperties(obj_)
+            self.extracellularProperties = obj_
+            obj_.original_tagname_ = 'extracellularProperties'
         super(BiophysicalProperties, self).buildChildren(child_, node, nodeName_, True)
 # end class BiophysicalProperties
 
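Another change made uniformly across this patch is `original_tagname_`: `buildChildren()` now records the XML tag each child was parsed from, and `export()` prefers that recorded name over the method's default `name_`, so elements round-trip under the tag they arrived with. A hedged sketch of the mechanism (minimal stand-in class, not the generated API):

```python
# Sketch of the original_tagname_ round-trip added throughout this patch.
class Node(object):
    def __init__(self, default_name):
        self.default_name = default_name
        self.original_tagname_ = None     # set during parse, as in buildChildren()

    def build(self, parsed_tag):
        # While parsing, remember the tag actually seen in the XML.
        self.original_tagname_ = parsed_tag
        return self                       # the patch also makes build() return self

    def export_tag(self):
        # On export, the recorded tag wins, mirroring the new export() preamble.
        if self.original_tagname_ is not None:
            return self.original_tagname_
        return self.default_name

n = Node('biophysicalProperties')
assert n.export_tag() == 'biophysicalProperties'   # never parsed: default name
n.build('membraneProperties')
assert n.export_tag() == 'membraneProperties'      # parsed: recorded tag wins
```

This matters when one class can appear under several element names; without the recorded tag, re-export would rename such elements to the class default.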
@@ -7866,12 +7152,18 @@ class InhomogeneousParam(Base):
     subclass = None
     superclass = Base
     def __init__(self, id=None, neuroLexId=None, variable=None, metric=None, proximal=None, distal=None):
+        self.original_tagname_ = None
         super(InhomogeneousParam, self).__init__(id, neuroLexId, )
         self.variable = _cast(None, variable)
         self.metric = _cast(None, metric)
         self.proximal = proximal
         self.distal = distal
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, InhomogeneousParam)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if InhomogeneousParam.subclass:
             return InhomogeneousParam.subclass(*args_, **kwargs_)
         else:
@@ -7887,7 +7179,16 @@ class InhomogeneousParam(Base):
     def set_metric(self, metric): self.metric = metric
     def validate_Metric(self, value):
         # Validate type Metric, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            value = str(value)
+            enumerations = ['Path Length from root']
+            enumeration_respectee = False
+            for enum in enumerations:
+                if value == enum:
+                    enumeration_respectee = True
+                    break
+            if not enumeration_respectee:
+                warnings_.warn('Value "%(value)s" does not match xsd enumeration restriction on Metric' % {"value" : value.encode("utf-8")} )
     def hasContent_(self):
         if (
             self.proximal is not None or
@@ -7902,13 +7203,15 @@ class InhomogeneousParam(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='InhomogeneousParam')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='InhomogeneousParam', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -7917,7 +7220,7 @@ class InhomogeneousParam(Base):
         super(InhomogeneousParam, self).exportAttributes(outfile, level, already_processed, namespace_, name_='InhomogeneousParam')
         if self.variable is not None and 'variable' not in already_processed:
             already_processed.add('variable')
-            outfile.write(' variable=%s' % (self.gds_format_string(quote_attrib(self.variable).encode(ExternalEncoding), input_name='variable'), ))
+            outfile.write(' variable=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.variable), input_name='variable')), ))
         if self.metric is not None and 'metric' not in already_processed:
             already_processed.add('metric')
             outfile.write(' metric=%s' % (quote_attrib(self.metric), ))
@@ -7931,42 +7234,13 @@ class InhomogeneousParam(Base):
             self.proximal.export(outfile, level, namespace_, name_='proximal', pretty_print=pretty_print)
         if self.distal is not None:
             self.distal.export(outfile, level, namespace_, name_='distal', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='InhomogeneousParam'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.variable is not None and 'variable' not in already_processed:
-            already_processed.add('variable')
-            showIndent(outfile, level)
-            outfile.write('variable="%s",\n' % (self.variable,))
-        if self.metric is not None and 'metric' not in already_processed:
-            already_processed.add('metric')
-            showIndent(outfile, level)
-            outfile.write('metric="%s",\n' % (self.metric,))
-        super(InhomogeneousParam, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(InhomogeneousParam, self).exportLiteralChildren(outfile, level, name_)
-        if self.proximal is not None:
-            showIndent(outfile, level)
-            outfile.write('proximal=model_.ProximalDetails(\n')
-            self.proximal.exportLiteral(outfile, level, name_='proximal')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.distal is not None:
-            showIndent(outfile, level)
-            outfile.write('distal=model_.DistalDetails(\n')
-            self.distal.exportLiteral(outfile, level, name_='distal')
-            showIndent(outfile, level)
-            outfile.write('),\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('variable', node)
         if value is not None and 'variable' not in already_processed:
@@ -7982,11 +7256,13 @@ class InhomogeneousParam(Base):
         if nodeName_ == 'proximal':
             obj_ = ProximalDetails.factory()
             obj_.build(child_)
-            self.set_proximal(obj_)
+            self.proximal = obj_
+            obj_.original_tagname_ = 'proximal'
         elif nodeName_ == 'distal':
             obj_ = DistalDetails.factory()
             obj_.build(child_)
-            self.set_distal(obj_)
+            self.distal = obj_
+            obj_.original_tagname_ = 'distal'
         super(InhomogeneousParam, self).buildChildren(child_, node, nodeName_, True)
 # end class InhomogeneousParam
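An aside on the list-accessor rename visible in the hunks above: the old generated `insert_X(index, value)` methods actually assigned by index (`self.X[index] = value`), silently overwriting instead of inserting. The regenerated bindings split this into `insert_X_at` (a true `list.insert`) and `replace_X_at` (index assignment). A minimal sketch of the new semantics, using a hypothetical `Group` class standing in for the generated `SegmentGroup`:

```python
# Illustrative sketch only (not part of the patch). Group mimics the
# generated SegmentGroup list accessors after the rename.
class Group:
    def __init__(self):
        self.member = []
    def add_member(self, value):
        self.member.append(value)
    # True insertion: shifts later elements right (old insert_member
    # overwrote the slot instead).
    def insert_member_at(self, index, value):
        self.member.insert(index, value)
    # Replacement: overwrites the element in place.
    def replace_member_at(self, index, value):
        self.member[index] = value

g = Group()
g.add_member('a')
g.add_member('c')
g.insert_member_at(1, 'b')    # member -> ['a', 'b', 'c']
g.replace_member_at(2, 'C')   # member -> ['a', 'b', 'C']
```

The same split is applied uniformly to every repeated child (`member`, `include`, `path`, `subTree`, `inhomogeneousParam`, `segment`, `segmentGroup`), so callers relying on the old replace-by-index behavior should move to the `replace_X_at` variants.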
 
@@ -7995,6 +7271,7 @@ class SegmentGroup(Base):
     subclass = None
     superclass = Base
     def __init__(self, id=None, neuroLexId=None, member=None, include=None, path=None, subTree=None, inhomogeneousParam=None):
+        self.original_tagname_ = None
         super(SegmentGroup, self).__init__(id, neuroLexId, )
         if member is None:
             self.member = []
@@ -8017,6 +7294,11 @@ class SegmentGroup(Base):
         else:
             self.inhomogeneousParam = inhomogeneousParam
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, SegmentGroup)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if SegmentGroup.subclass:
             return SegmentGroup.subclass(*args_, **kwargs_)
         else:
@@ -8025,23 +7307,28 @@ class SegmentGroup(Base):
     def get_member(self): return self.member
     def set_member(self, member): self.member = member
     def add_member(self, value): self.member.append(value)
-    def insert_member(self, index, value): self.member[index] = value
+    def insert_member_at(self, index, value): self.member.insert(index, value)
+    def replace_member_at(self, index, value): self.member[index] = value
     def get_include(self): return self.include
     def set_include(self, include): self.include = include
     def add_include(self, value): self.include.append(value)
-    def insert_include(self, index, value): self.include[index] = value
+    def insert_include_at(self, index, value): self.include.insert(index, value)
+    def replace_include_at(self, index, value): self.include[index] = value
     def get_path(self): return self.path
     def set_path(self, path): self.path = path
     def add_path(self, value): self.path.append(value)
-    def insert_path(self, index, value): self.path[index] = value
+    def insert_path_at(self, index, value): self.path.insert(index, value)
+    def replace_path_at(self, index, value): self.path[index] = value
     def get_subTree(self): return self.subTree
     def set_subTree(self, subTree): self.subTree = subTree
     def add_subTree(self, value): self.subTree.append(value)
-    def insert_subTree(self, index, value): self.subTree[index] = value
+    def insert_subTree_at(self, index, value): self.subTree.insert(index, value)
+    def replace_subTree_at(self, index, value): self.subTree[index] = value
     def get_inhomogeneousParam(self): return self.inhomogeneousParam
     def set_inhomogeneousParam(self, inhomogeneousParam): self.inhomogeneousParam = inhomogeneousParam
     def add_inhomogeneousParam(self, value): self.inhomogeneousParam.append(value)
-    def insert_inhomogeneousParam(self, index, value): self.inhomogeneousParam[index] = value
+    def insert_inhomogeneousParam_at(self, index, value): self.inhomogeneousParam.insert(index, value)
+    def replace_inhomogeneousParam_at(self, index, value): self.inhomogeneousParam[index] = value
     def hasContent_(self):
         if (
             self.member or
@@ -8059,13 +7346,15 @@ class SegmentGroup(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='SegmentGroup')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='SegmentGroup', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -8088,82 +7377,13 @@ class SegmentGroup(Base):
             subTree_.export(outfile, level, namespace_, name_='subTree', pretty_print=pretty_print)
         for inhomogeneousParam_ in self.inhomogeneousParam:
             inhomogeneousParam_.export(outfile, level, namespace_, name_='inhomogeneousParam', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='SegmentGroup'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        super(SegmentGroup, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(SegmentGroup, self).exportLiteralChildren(outfile, level, name_)
-        showIndent(outfile, level)
-        outfile.write('member=[\n')
-        level += 1
-        for member_ in self.member:
-            showIndent(outfile, level)
-            outfile.write('model_.Member(\n')
-            member_.exportLiteral(outfile, level, name_='Member')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('include=[\n')
-        level += 1
-        for include_ in self.include:
-            showIndent(outfile, level)
-            outfile.write('model_.Include(\n')
-            include_.exportLiteral(outfile, level, name_='Include')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('path=[\n')
-        level += 1
-        for path_ in self.path:
-            showIndent(outfile, level)
-            outfile.write('model_.Path(\n')
-            path_.exportLiteral(outfile, level, name_='Path')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('subTree=[\n')
-        level += 1
-        for subTree_ in self.subTree:
-            showIndent(outfile, level)
-            outfile.write('model_.SubTree(\n')
-            subTree_.exportLiteral(outfile, level, name_='SubTree')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('inhomogeneousParam=[\n')
-        level += 1
-        for inhomogeneousParam_ in self.inhomogeneousParam:
-            showIndent(outfile, level)
-            outfile.write('model_.InhomogeneousParam(\n')
-            inhomogeneousParam_.exportLiteral(outfile, level, name_='InhomogeneousParam')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         super(SegmentGroup, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
@@ -8171,22 +7391,27 @@ class SegmentGroup(Base):
             obj_ = Member.factory()
             obj_.build(child_)
             self.member.append(obj_)
+            obj_.original_tagname_ = 'member'
         elif nodeName_ == 'include':
             obj_ = Include.factory()
             obj_.build(child_)
             self.include.append(obj_)
+            obj_.original_tagname_ = 'include'
         elif nodeName_ == 'path':
             obj_ = Path.factory()
             obj_.build(child_)
             self.path.append(obj_)
+            obj_.original_tagname_ = 'path'
         elif nodeName_ == 'subTree':
             obj_ = SubTree.factory()
             obj_.build(child_)
             self.subTree.append(obj_)
+            obj_.original_tagname_ = 'subTree'
         elif nodeName_ == 'inhomogeneousParam':
             obj_ = InhomogeneousParam.factory()
             obj_.build(child_)
             self.inhomogeneousParam.append(obj_)
+            obj_.original_tagname_ = 'inhomogeneousParam'
         super(SegmentGroup, self).buildChildren(child_, node, nodeName_, True)
 # end class SegmentGroup
 
@@ -8195,12 +7420,18 @@ class Segment(Base):
     subclass = None
     superclass = Base
     def __init__(self, id=None, neuroLexId=None, name=None, parent=None, proximal=None, distal=None):
+        self.original_tagname_ = None
         super(Segment, self).__init__(id, neuroLexId, )
         self.name = _cast(None, name)
         self.parent = parent
         self.proximal = proximal
         self.distal = distal
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Segment)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Segment.subclass:
             return Segment.subclass(*args_, **kwargs_)
         else:
@@ -8229,13 +7460,15 @@ class Segment(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Segment')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Segment', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -8244,7 +7477,7 @@ class Segment(Base):
         super(Segment, self).exportAttributes(outfile, level, already_processed, namespace_, name_='Segment')
         if self.name is not None and 'name' not in already_processed:
             already_processed.add('name')
-            outfile.write(' name=%s' % (self.gds_format_string(quote_attrib(self.name).encode(ExternalEncoding), input_name='name'), ))
+            outfile.write(' name=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.name), input_name='name')), ))
     def exportChildren(self, outfile, level, namespace_='', name_='Segment', fromsubclass_=False, pretty_print=True):
         super(Segment, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
         if pretty_print:
@@ -8257,44 +7490,13 @@ class Segment(Base):
             self.proximal.export(outfile, level, namespace_, name_='proximal', pretty_print=pretty_print)
         if self.distal is not None:
             self.distal.export(outfile, level, namespace_, name_='distal', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='Segment'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.name is not None and 'name' not in already_processed:
-            already_processed.add('name')
-            showIndent(outfile, level)
-            outfile.write('name="%s",\n' % (self.name,))
-        super(Segment, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(Segment, self).exportLiteralChildren(outfile, level, name_)
-        if self.parent is not None:
-            showIndent(outfile, level)
-            outfile.write('parent=model_.SegmentParent(\n')
-            self.parent.exportLiteral(outfile, level, name_='parent')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.proximal is not None:
-            showIndent(outfile, level)
-            outfile.write('proximal=model_.Point3DWithDiam(\n')
-            self.proximal.exportLiteral(outfile, level, name_='proximal')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.distal is not None:
-            showIndent(outfile, level)
-            outfile.write('distal=model_.Point3DWithDiam(\n')
-            self.distal.exportLiteral(outfile, level, name_='distal')
-            showIndent(outfile, level)
-            outfile.write('),\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('name', node)
         if value is not None and 'name' not in already_processed:
@@ -8305,15 +7507,18 @@ class Segment(Base):
         if nodeName_ == 'parent':
             obj_ = SegmentParent.factory()
             obj_.build(child_)
-            self.set_parent(obj_)
+            self.parent = obj_
+            obj_.original_tagname_ = 'parent'
         elif nodeName_ == 'proximal':
             obj_ = Point3DWithDiam.factory()
             obj_.build(child_)
-            self.set_proximal(obj_)
+            self.proximal = obj_
+            obj_.original_tagname_ = 'proximal'
         elif nodeName_ == 'distal':
             obj_ = Point3DWithDiam.factory()
             obj_.build(child_)
-            self.set_distal(obj_)
+            self.distal = obj_
+            obj_.original_tagname_ = 'distal'
         super(Segment, self).buildChildren(child_, node, nodeName_, True)
 # end class Segment
 
@@ -8324,6 +7529,7 @@ class Morphology(Standalone):
     subclass = None
     superclass = Standalone
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, segment=None, segmentGroup=None):
+        self.original_tagname_ = None
         super(Morphology, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
         if segment is None:
             self.segment = []
@@ -8334,6 +7540,11 @@ class Morphology(Standalone):
         else:
             self.segmentGroup = segmentGroup
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Morphology)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Morphology.subclass:
             return Morphology.subclass(*args_, **kwargs_)
         else:
@@ -8342,11 +7553,13 @@ class Morphology(Standalone):
     def get_segment(self): return self.segment
     def set_segment(self, segment): self.segment = segment
     def add_segment(self, value): self.segment.append(value)
-    def insert_segment(self, index, value): self.segment[index] = value
+    def insert_segment_at(self, index, value): self.segment.insert(index, value)
+    def replace_segment_at(self, index, value): self.segment[index] = value
     def get_segmentGroup(self): return self.segmentGroup
     def set_segmentGroup(self, segmentGroup): self.segmentGroup = segmentGroup
     def add_segmentGroup(self, value): self.segmentGroup.append(value)
-    def insert_segmentGroup(self, index, value): self.segmentGroup[index] = value
+    def insert_segmentGroup_at(self, index, value): self.segmentGroup.insert(index, value)
+    def replace_segmentGroup_at(self, index, value): self.segmentGroup[index] = value
     def hasContent_(self):
         if (
             self.segment or
@@ -8361,13 +7574,15 @@ class Morphology(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Morphology')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Morphology', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -8384,46 +7599,13 @@ class Morphology(Standalone):
             segment_.export(outfile, level, namespace_, name_='segment', pretty_print=pretty_print)
         for segmentGroup_ in self.segmentGroup:
             segmentGroup_.export(outfile, level, namespace_, name_='segmentGroup', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='Morphology'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        super(Morphology, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(Morphology, self).exportLiteralChildren(outfile, level, name_)
-        showIndent(outfile, level)
-        outfile.write('segment=[\n')
-        level += 1
-        for segment_ in self.segment:
-            showIndent(outfile, level)
-            outfile.write('model_.Segment(\n')
-            segment_.exportLiteral(outfile, level, name_='Segment')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('segmentGroup=[\n')
-        level += 1
-        for segmentGroup_ in self.segmentGroup:
-            showIndent(outfile, level)
-            outfile.write('model_.SegmentGroup(\n')
-            segmentGroup_.exportLiteral(outfile, level, name_='SegmentGroup')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         super(Morphology, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
@@ -8431,10 +7613,12 @@ class Morphology(Standalone):
             obj_ = Segment.factory()
             obj_.build(child_)
             self.segment.append(obj_)
+            obj_.original_tagname_ = 'segment'
         elif nodeName_ == 'segmentGroup':
             obj_ = SegmentGroup.factory()
             obj_.build(child_)
             self.segmentGroup.append(obj_)
+            obj_.original_tagname_ = 'segmentGroup'
         super(Morphology, self).buildChildren(child_, node, nodeName_, True)
 # end class Morphology
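The `original_tagname_` machinery added throughout these hunks works as a pair: `buildChildren` records the XML tag each child was parsed from, and `export` prefers that recorded tag over the default `name_`, so round-tripped documents keep the source spelling (e.g. lowercase `segment` rather than the class name `Segment`). The diff also makes `build()` return `self`, enabling chaining. A minimal sketch under those assumptions, with a hypothetical `Node` class standing in for the generated ones:

```python
# Illustrative sketch only (not part of the patch). Node condenses the
# generated build()/export() pattern from the regenerated bindings.
class Node:
    def __init__(self):
        self.original_tagname_ = None
    def build(self, tagname):
        # buildChildren in the generated code sets original_tagname_ on
        # each parsed child; build() now returns self for chaining.
        self.original_tagname_ = tagname
        return self
    def export(self, name_='Segment'):
        # export prefers the tag the element was parsed from, if any.
        if self.original_tagname_ is not None:
            name_ = self.original_tagname_
        return '<%s/>' % name_

n = Node().build('segment')   # chaining works because build() returns self
n.export()                    # '<segment/>' -- source tag preserved
Node().export()               # '<Segment/>' -- default used when unparsed
```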
 
@@ -8443,9 +7627,15 @@ class BaseCell(Standalone):
     subclass = None
     superclass = Standalone
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, extensiontype_=None):
+        self.original_tagname_ = None
         super(BaseCell, self).__init__(id, neuroLexId, name, metaid, notes, annotation, extensiontype_, )
         self.extensiontype_ = extensiontype_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, BaseCell)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if BaseCell.subclass:
             return BaseCell.subclass(*args_, **kwargs_)
         else:
@@ -8465,13 +7655,15 @@ class BaseCell(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='BaseCell')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='BaseCell', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -8484,22 +7676,13 @@ class BaseCell(Standalone):
             outfile.write(' xsi:type="%s"' % self.extensiontype_)
     def exportChildren(self, outfile, level, namespace_='', name_='BaseCell', fromsubclass_=False, pretty_print=True):
         super(BaseCell, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='BaseCell'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        super(BaseCell, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(BaseCell, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('xsi:type', node)
         if value is not None and 'xsi:type' not in already_processed:
@@ -8516,9 +7699,15 @@ class BaseSynapse(Standalone):
     subclass = None
     superclass = Standalone
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, extensiontype_=None):
+        self.original_tagname_ = None
         super(BaseSynapse, self).__init__(id, neuroLexId, name, metaid, notes, annotation, extensiontype_, )
         self.extensiontype_ = extensiontype_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, BaseSynapse)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if BaseSynapse.subclass:
             return BaseSynapse.subclass(*args_, **kwargs_)
         else:
@@ -8538,13 +7727,15 @@ class BaseSynapse(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='BaseSynapse')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='BaseSynapse', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -8557,22 +7748,13 @@ class BaseSynapse(Standalone):
             outfile.write(' xsi:type="%s"' % self.extensiontype_)
     def exportChildren(self, outfile, level, namespace_='', name_='BaseSynapse', fromsubclass_=False, pretty_print=True):
         super(BaseSynapse, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='BaseSynapse'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        super(BaseSynapse, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(BaseSynapse, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('xsi:type', node)
         if value is not None and 'xsi:type' not in already_processed:
@@ -8589,14 +7771,20 @@ class DecayingPoolConcentrationModel(Standalone):
     """Should not be required, as it's present on the species element!"""
     subclass = None
     superclass = Standalone
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, ion=None, shellThickness=None, restingConc=None, decayConstant=None, extensiontype_=None):
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, ion=None, restingConc=None, decayConstant=None, shellThickness=None, extensiontype_=None):
+        self.original_tagname_ = None
         super(DecayingPoolConcentrationModel, self).__init__(id, neuroLexId, name, metaid, notes, annotation, extensiontype_, )
         self.ion = _cast(None, ion)
-        self.shellThickness = _cast(None, shellThickness)
         self.restingConc = _cast(None, restingConc)
         self.decayConstant = _cast(None, decayConstant)
+        self.shellThickness = _cast(None, shellThickness)
         self.extensiontype_ = extensiontype_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, DecayingPoolConcentrationModel)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if DecayingPoolConcentrationModel.subclass:
             return DecayingPoolConcentrationModel.subclass(*args_, **kwargs_)
         else:
@@ -8604,26 +7792,42 @@ class DecayingPoolConcentrationModel(Standalone):
     factory = staticmethod(factory)
     def get_ion(self): return self.ion
     def set_ion(self, ion): self.ion = ion
-    def validate_NmlId(self, value):
-        # Validate type NmlId, a restriction on xs:string.
-        pass
-    def get_shellThickness(self): return self.shellThickness
-    def set_shellThickness(self, shellThickness): self.shellThickness = shellThickness
-    def validate_Nml2Quantity_length(self, value):
-        # Validate type Nml2Quantity_length, a restriction on xs:string.
-        pass
     def get_restingConc(self): return self.restingConc
     def set_restingConc(self, restingConc): self.restingConc = restingConc
-    def validate_Nml2Quantity_concentration(self, value):
-        # Validate type Nml2Quantity_concentration, a restriction on xs:string.
-        pass
     def get_decayConstant(self): return self.decayConstant
     def set_decayConstant(self, decayConstant): self.decayConstant = decayConstant
-    def validate_Nml2Quantity_time(self, value):
-        # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
+    def get_shellThickness(self): return self.shellThickness
+    def set_shellThickness(self, shellThickness): self.shellThickness = shellThickness
     def get_extensiontype_(self): return self.extensiontype_
     def set_extensiontype_(self, extensiontype_): self.extensiontype_ = extensiontype_
+    def validate_NmlId(self, value):
+        # Validate type NmlId, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
+    def validate_Nml2Quantity_concentration(self, value):
+        # Validate type Nml2Quantity_concentration, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_concentration_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_concentration_patterns_, ))
+    validate_Nml2Quantity_concentration_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(mol_per_m3|mol_per_cm3|M|mM)$']]
+    def validate_Nml2Quantity_time(self, value):
+        # Validate type Nml2Quantity_time, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
+    def validate_Nml2Quantity_length(self, value):
+        # Validate type Nml2Quantity_length, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_length_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_length_patterns_, ))
+    validate_Nml2Quantity_length_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(m|cm|um)$']]
     def hasContent_(self):
         if (
             super(DecayingPoolConcentrationModel, self).hasContent_()
@@ -8636,13 +7840,15 @@ class DecayingPoolConcentrationModel(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='DecayingPoolConcentrationModel')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='DecayingPoolConcentrationModel', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -8652,64 +7858,34 @@ class DecayingPoolConcentrationModel(Standalone):
         if self.ion is not None and 'ion' not in already_processed:
             already_processed.add('ion')
             outfile.write(' ion=%s' % (quote_attrib(self.ion), ))
-        if self.shellThickness is not None and 'shellThickness' not in already_processed:
-            already_processed.add('shellThickness')
-            outfile.write(' shellThickness=%s' % (quote_attrib(self.shellThickness), ))
         if self.restingConc is not None and 'restingConc' not in already_processed:
             already_processed.add('restingConc')
             outfile.write(' restingConc=%s' % (quote_attrib(self.restingConc), ))
         if self.decayConstant is not None and 'decayConstant' not in already_processed:
             already_processed.add('decayConstant')
             outfile.write(' decayConstant=%s' % (quote_attrib(self.decayConstant), ))
+        if self.shellThickness is not None and 'shellThickness' not in already_processed:
+            already_processed.add('shellThickness')
+            outfile.write(' shellThickness=%s' % (quote_attrib(self.shellThickness), ))
         if self.extensiontype_ is not None and 'xsi:type' not in already_processed:
             already_processed.add('xsi:type')
             outfile.write(' xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"')
             outfile.write(' xsi:type="%s"' % self.extensiontype_)
     def exportChildren(self, outfile, level, namespace_='', name_='DecayingPoolConcentrationModel', fromsubclass_=False, pretty_print=True):
         super(DecayingPoolConcentrationModel, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='DecayingPoolConcentrationModel'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.ion is not None and 'ion' not in already_processed:
-            already_processed.add('ion')
-            showIndent(outfile, level)
-            outfile.write('ion="%s",\n' % (self.ion,))
-        if self.shellThickness is not None and 'shellThickness' not in already_processed:
-            already_processed.add('shellThickness')
-            showIndent(outfile, level)
-            outfile.write('shellThickness="%s",\n' % (self.shellThickness,))
-        if self.restingConc is not None and 'restingConc' not in already_processed:
-            already_processed.add('restingConc')
-            showIndent(outfile, level)
-            outfile.write('restingConc="%s",\n' % (self.restingConc,))
-        if self.decayConstant is not None and 'decayConstant' not in already_processed:
-            already_processed.add('decayConstant')
-            showIndent(outfile, level)
-            outfile.write('decayConstant="%s",\n' % (self.decayConstant,))
-        super(DecayingPoolConcentrationModel, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(DecayingPoolConcentrationModel, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('ion', node)
         if value is not None and 'ion' not in already_processed:
             already_processed.add('ion')
             self.ion = value
             self.validate_NmlId(self.ion)    # validate type NmlId
-        value = find_attr_value_('shellThickness', node)
-        if value is not None and 'shellThickness' not in already_processed:
-            already_processed.add('shellThickness')
-            self.shellThickness = value
-            self.validate_Nml2Quantity_length(self.shellThickness)    # validate type Nml2Quantity_length
         value = find_attr_value_('restingConc', node)
         if value is not None and 'restingConc' not in already_processed:
             already_processed.add('restingConc')
@@ -8720,6 +7896,11 @@ class DecayingPoolConcentrationModel(Standalone):
             already_processed.add('decayConstant')
             self.decayConstant = value
             self.validate_Nml2Quantity_time(self.decayConstant)    # validate type Nml2Quantity_time
+        value = find_attr_value_('shellThickness', node)
+        if value is not None and 'shellThickness' not in already_processed:
+            already_processed.add('shellThickness')
+            self.shellThickness = value
+            self.validate_Nml2Quantity_length(self.shellThickness)    # validate type Nml2Quantity_length
         value = find_attr_value_('xsi:type', node)
         if value is not None and 'xsi:type' not in already_processed:
             already_processed.add('xsi:type')
@@ -8735,15 +7916,22 @@ class GateHHRatesInf(Base):
     subclass = None
     superclass = Base
     def __init__(self, id=None, neuroLexId=None, instances=1, type_=None, notes=None, q10Settings=None, forwardRate=None, reverseRate=None, steadyState=None):
+        self.original_tagname_ = None
         super(GateHHRatesInf, self).__init__(id, neuroLexId, )
         self.instances = _cast(int, instances)
         self.type_ = _cast(None, type_)
         self.notes = notes
+        self.validate_Notes(self.notes)
         self.q10Settings = q10Settings
         self.forwardRate = forwardRate
         self.reverseRate = reverseRate
         self.steadyState = steadyState
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, GateHHRatesInf)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if GateHHRatesInf.subclass:
             return GateHHRatesInf.subclass(*args_, **kwargs_)
         else:
@@ -8751,9 +7939,6 @@ class GateHHRatesInf(Base):
     factory = staticmethod(factory)
     def get_notes(self): return self.notes
     def set_notes(self, notes): self.notes = notes
-    def validate_Notes(self, value):
-        # Validate type Notes, a restriction on xs:string.
-        pass
     def get_q10Settings(self): return self.q10Settings
     def set_q10Settings(self, q10Settings): self.q10Settings = q10Settings
     def get_forwardRate(self): return self.forwardRate
@@ -8766,9 +7951,22 @@ class GateHHRatesInf(Base):
     def set_instances(self, instances): self.instances = instances
     def get_type(self): return self.type_
     def set_type(self, type_): self.type_ = type_
+    def validate_Notes(self, value):
+        # Validate type Notes, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            pass
     def validate_gateTypes(self, value):
         # Validate type gateTypes, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            value = str(value)
+            enumerations = ['gateHHrates', 'gateHHratesTau', 'gateHHtauInf', 'gateHHratesInf', 'gateKS']
+            enumeration_respectee = False
+            for enum in enumerations:
+                if value == enum:
+                    enumeration_respectee = True
+                    break
+            if not enumeration_respectee:
+                warnings_.warn('Value "%(value)s" does not match xsd enumeration restriction on gateTypes' % {"value" : value.encode("utf-8")} )
     def hasContent_(self):
         if (
             self.notes is not None or
@@ -8786,20 +7984,22 @@ class GateHHRatesInf(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='GateHHRatesInf')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='GateHHRatesInf', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='GateHHRatesInf'):
         super(GateHHRatesInf, self).exportAttributes(outfile, level, already_processed, namespace_, name_='GateHHRatesInf')
-        if self.instances is not None and 'instances' not in already_processed:
+        if self.instances != 1 and 'instances' not in already_processed:
             already_processed.add('instances')
             outfile.write(' instances="%s"' % self.gds_format_integer(self.instances, input_name='instances'))
         if self.type_ is not None and 'type_' not in already_processed:
@@ -8813,7 +8013,7 @@ class GateHHRatesInf(Base):
             eol_ = ''
         if self.notes is not None:
             showIndent(outfile, level, pretty_print)
-            outfile.write('<%snotes>%s</%snotes>%s' % (namespace_, self.gds_format_string(quote_xml(self.notes).encode(ExternalEncoding), input_name='notes'), namespace_, eol_))
+            outfile.write('<%snotes>%s</%snotes>%s' % (namespace_, self.gds_encode(self.gds_format_string(quote_xml(self.notes), input_name='notes')), namespace_, eol_))
         if self.q10Settings is not None:
             self.q10Settings.export(outfile, level, namespace_, name_='q10Settings', pretty_print=pretty_print)
         if self.forwardRate is not None:
@@ -8822,57 +8022,13 @@ class GateHHRatesInf(Base):
             self.reverseRate.export(outfile, level, namespace_, name_='reverseRate', pretty_print=pretty_print)
         if self.steadyState is not None:
             self.steadyState.export(outfile, level, namespace_, name_='steadyState', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='GateHHRatesInf'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.instances is not None and 'instances' not in already_processed:
-            already_processed.add('instances')
-            showIndent(outfile, level)
-            outfile.write('instances=%d,\n' % (self.instances,))
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            showIndent(outfile, level)
-            outfile.write('type_="%s",\n' % (self.type_,))
-        super(GateHHRatesInf, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(GateHHRatesInf, self).exportLiteralChildren(outfile, level, name_)
-        if self.notes is not None:
-            showIndent(outfile, level)
-            outfile.write('notes=%s,\n' % quote_python(self.notes).encode(ExternalEncoding))
-        if self.q10Settings is not None:
-            showIndent(outfile, level)
-            outfile.write('q10Settings=model_.Q10Settings(\n')
-            self.q10Settings.exportLiteral(outfile, level, name_='q10Settings')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.forwardRate is not None:
-            showIndent(outfile, level)
-            outfile.write('forwardRate=model_.HHRate(\n')
-            self.forwardRate.exportLiteral(outfile, level, name_='forwardRate')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.reverseRate is not None:
-            showIndent(outfile, level)
-            outfile.write('reverseRate=model_.HHRate(\n')
-            self.reverseRate.exportLiteral(outfile, level, name_='reverseRate')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.steadyState is not None:
-            showIndent(outfile, level)
-            outfile.write('steadyState=model_.HHVariable(\n')
-            self.steadyState.exportLiteral(outfile, level, name_='steadyState')
-            showIndent(outfile, level)
-            outfile.write('),\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('instances', node)
         if value is not None and 'instances' not in already_processed:
@@ -8892,23 +8048,28 @@ class GateHHRatesInf(Base):
             notes_ = child_.text
             notes_ = self.gds_validate_string(notes_, node, 'notes')
             self.notes = notes_
-            self.validate_Notes(self.notes)    # validate type Notes
+            # validate type Notes
+            self.validate_Notes(self.notes)
         elif nodeName_ == 'q10Settings':
             obj_ = Q10Settings.factory()
             obj_.build(child_)
-            self.set_q10Settings(obj_)
+            self.q10Settings = obj_
+            obj_.original_tagname_ = 'q10Settings'
         elif nodeName_ == 'forwardRate':
             obj_ = HHRate.factory()
             obj_.build(child_)
-            self.set_forwardRate(obj_)
+            self.forwardRate = obj_
+            obj_.original_tagname_ = 'forwardRate'
         elif nodeName_ == 'reverseRate':
             obj_ = HHRate.factory()
             obj_.build(child_)
-            self.set_reverseRate(obj_)
+            self.reverseRate = obj_
+            obj_.original_tagname_ = 'reverseRate'
         elif nodeName_ == 'steadyState':
             obj_ = HHVariable.factory()
             obj_.build(child_)
-            self.set_steadyState(obj_)
+            self.steadyState = obj_
+            obj_.original_tagname_ = 'steadyState'
         super(GateHHRatesInf, self).buildChildren(child_, node, nodeName_, True)
 # end class GateHHRatesInf
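The regenerated classes above replace the old no-op `validate_*` stubs with real XSD pattern checks that emit a `warnings.warn` on mismatch. A minimal standalone sketch of that scheme, using the `Nml2Quantity_time` pattern copied from the diff (the helper name and return value are illustrative, not part of the generated API):

```python
import re
import warnings

# Pattern taken verbatim from validate_Nml2Quantity_time_patterns_ in the diff.
NML2_TIME_PATTERN = r'^-?([0-9]*(\.[0-9]+)?)([eE]-?[0-9]+)?[\s]*(s|ms)$'

def validate_nml2_time(value):
    """Return True if value matches the Nml2Quantity_time XSD pattern.

    Mirrors the generated validators: None passes silently, and a
    mismatch produces a warning rather than raising an exception.
    """
    if value is None:
        return True
    if re.match(NML2_TIME_PATTERN, value):
        return True
    warnings.warn('Value "%s" does not match xsd pattern restrictions' % value)
    return False
```

Warning instead of raising matches the generated code's behavior: parsing continues even when an attribute value is malformed.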
 
@@ -8917,15 +8078,22 @@ class GateHHRatesTau(Base):
     subclass = None
     superclass = Base
     def __init__(self, id=None, neuroLexId=None, instances=1, type_=None, notes=None, q10Settings=None, forwardRate=None, reverseRate=None, timeCourse=None):
+        self.original_tagname_ = None
         super(GateHHRatesTau, self).__init__(id, neuroLexId, )
         self.instances = _cast(int, instances)
         self.type_ = _cast(None, type_)
         self.notes = notes
+        self.validate_Notes(self.notes)
         self.q10Settings = q10Settings
         self.forwardRate = forwardRate
         self.reverseRate = reverseRate
         self.timeCourse = timeCourse
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, GateHHRatesTau)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if GateHHRatesTau.subclass:
             return GateHHRatesTau.subclass(*args_, **kwargs_)
         else:
@@ -8933,9 +8101,6 @@ class GateHHRatesTau(Base):
     factory = staticmethod(factory)
     def get_notes(self): return self.notes
     def set_notes(self, notes): self.notes = notes
-    def validate_Notes(self, value):
-        # Validate type Notes, a restriction on xs:string.
-        pass
     def get_q10Settings(self): return self.q10Settings
     def set_q10Settings(self, q10Settings): self.q10Settings = q10Settings
     def get_forwardRate(self): return self.forwardRate
@@ -8948,9 +8113,22 @@ class GateHHRatesTau(Base):
     def set_instances(self, instances): self.instances = instances
     def get_type(self): return self.type_
     def set_type(self, type_): self.type_ = type_
+    def validate_Notes(self, value):
+        # Validate type Notes, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            pass
     def validate_gateTypes(self, value):
         # Validate type gateTypes, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            value = str(value)
+            enumerations = ['gateHHrates', 'gateHHratesTau', 'gateHHtauInf', 'gateHHratesInf', 'gateKS']
+            enumeration_respectee = False
+            for enum in enumerations:
+                if value == enum:
+                    enumeration_respectee = True
+                    break
+            if not enumeration_respectee:
+                warnings_.warn('Value "%(value)s" does not match xsd enumeration restriction on gateTypes' % {"value" : value.encode("utf-8")} )
     def hasContent_(self):
         if (
             self.notes is not None or
@@ -8968,20 +8146,22 @@ class GateHHRatesTau(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='GateHHRatesTau')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='GateHHRatesTau', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='GateHHRatesTau'):
         super(GateHHRatesTau, self).exportAttributes(outfile, level, already_processed, namespace_, name_='GateHHRatesTau')
-        if self.instances is not None and 'instances' not in already_processed:
+        if self.instances != 1 and 'instances' not in already_processed:
             already_processed.add('instances')
             outfile.write(' instances="%s"' % self.gds_format_integer(self.instances, input_name='instances'))
         if self.type_ is not None and 'type_' not in already_processed:
@@ -8995,7 +8175,7 @@ class GateHHRatesTau(Base):
             eol_ = ''
         if self.notes is not None:
             showIndent(outfile, level, pretty_print)
-            outfile.write('<%snotes>%s</%snotes>%s' % (namespace_, self.gds_format_string(quote_xml(self.notes).encode(ExternalEncoding), input_name='notes'), namespace_, eol_))
+            outfile.write('<%snotes>%s</%snotes>%s' % (namespace_, self.gds_encode(self.gds_format_string(quote_xml(self.notes), input_name='notes')), namespace_, eol_))
         if self.q10Settings is not None:
             self.q10Settings.export(outfile, level, namespace_, name_='q10Settings', pretty_print=pretty_print)
         if self.forwardRate is not None:
@@ -9004,57 +8184,13 @@ class GateHHRatesTau(Base):
             self.reverseRate.export(outfile, level, namespace_, name_='reverseRate', pretty_print=pretty_print)
         if self.timeCourse is not None:
             self.timeCourse.export(outfile, level, namespace_, name_='timeCourse', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='GateHHRatesTau'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.instances is not None and 'instances' not in already_processed:
-            already_processed.add('instances')
-            showIndent(outfile, level)
-            outfile.write('instances=%d,\n' % (self.instances,))
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            showIndent(outfile, level)
-            outfile.write('type_="%s",\n' % (self.type_,))
-        super(GateHHRatesTau, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(GateHHRatesTau, self).exportLiteralChildren(outfile, level, name_)
-        if self.notes is not None:
-            showIndent(outfile, level)
-            outfile.write('notes=%s,\n' % quote_python(self.notes).encode(ExternalEncoding))
-        if self.q10Settings is not None:
-            showIndent(outfile, level)
-            outfile.write('q10Settings=model_.Q10Settings(\n')
-            self.q10Settings.exportLiteral(outfile, level, name_='q10Settings')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.forwardRate is not None:
-            showIndent(outfile, level)
-            outfile.write('forwardRate=model_.HHRate(\n')
-            self.forwardRate.exportLiteral(outfile, level, name_='forwardRate')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.reverseRate is not None:
-            showIndent(outfile, level)
-            outfile.write('reverseRate=model_.HHRate(\n')
-            self.reverseRate.exportLiteral(outfile, level, name_='reverseRate')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.timeCourse is not None:
-            showIndent(outfile, level)
-            outfile.write('timeCourse=model_.HHTime(\n')
-            self.timeCourse.exportLiteral(outfile, level, name_='timeCourse')
-            showIndent(outfile, level)
-            outfile.write('),\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('instances', node)
         if value is not None and 'instances' not in already_processed:
@@ -9074,23 +8210,28 @@ class GateHHRatesTau(Base):
             notes_ = child_.text
             notes_ = self.gds_validate_string(notes_, node, 'notes')
             self.notes = notes_
-            self.validate_Notes(self.notes)    # validate type Notes
+            # validate type Notes
+            self.validate_Notes(self.notes)
         elif nodeName_ == 'q10Settings':
             obj_ = Q10Settings.factory()
             obj_.build(child_)
-            self.set_q10Settings(obj_)
+            self.q10Settings = obj_
+            obj_.original_tagname_ = 'q10Settings'
         elif nodeName_ == 'forwardRate':
             obj_ = HHRate.factory()
             obj_.build(child_)
-            self.set_forwardRate(obj_)
+            self.forwardRate = obj_
+            obj_.original_tagname_ = 'forwardRate'
         elif nodeName_ == 'reverseRate':
             obj_ = HHRate.factory()
             obj_.build(child_)
-            self.set_reverseRate(obj_)
+            self.reverseRate = obj_
+            obj_.original_tagname_ = 'reverseRate'
         elif nodeName_ == 'timeCourse':
             obj_ = HHTime.factory()
             obj_.build(child_)
-            self.set_timeCourse(obj_)
+            self.timeCourse = obj_
+            obj_.original_tagname_ = 'timeCourse'
         super(GateHHRatesTau, self).buildChildren(child_, node, nodeName_, True)
 # end class GateHHRatesTau
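Each updated `factory()` in this patch gains a dispatch step: if a subclass module is registered (`CurrentSubclassModule_`), the factory returns the registered subclass instead of the base class, falling back to the per-class `subclass` hook and finally the class itself. A self-contained sketch of that pattern (the lookup convention in `getSubclassFromModule_` is an assumption for illustration; the real helper lives in the generated module):

```python
# Module-level registry, as in the generated bindings; None disables dispatch.
CurrentSubclassModule_ = None

def getSubclassFromModule_(module, class_):
    # Assumed convention: look up an override named "<ClassName>Sub".
    return getattr(module, class_.__name__ + 'Sub', None)

class GateBase(object):
    subclass = None  # per-class override hook, checked after the module registry

    @classmethod
    def factory(cls, *args, **kwargs):
        if CurrentSubclassModule_ is not None:
            sub = getSubclassFromModule_(CurrentSubclassModule_, cls)
            if sub is not None:
                return sub(*args, **kwargs)
        if cls.subclass:
            return cls.subclass(*args, **kwargs)
        return cls(*args, **kwargs)
```

This lets downstream code extend the generated classes (e.g. to attach simulator-specific behavior) without editing the generated file: `buildChildren` keeps calling `Q10Settings.factory()` and transparently receives the override.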
 
@@ -9099,14 +8240,21 @@ class GateHHTauInf(Base):
     subclass = None
     superclass = Base
     def __init__(self, id=None, neuroLexId=None, instances=1, type_=None, notes=None, q10Settings=None, timeCourse=None, steadyState=None):
+        self.original_tagname_ = None
         super(GateHHTauInf, self).__init__(id, neuroLexId, )
         self.instances = _cast(int, instances)
         self.type_ = _cast(None, type_)
         self.notes = notes
+        self.validate_Notes(self.notes)
         self.q10Settings = q10Settings
         self.timeCourse = timeCourse
         self.steadyState = steadyState
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, GateHHTauInf)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if GateHHTauInf.subclass:
             return GateHHTauInf.subclass(*args_, **kwargs_)
         else:
@@ -9114,9 +8262,6 @@ class GateHHTauInf(Base):
     factory = staticmethod(factory)
     def get_notes(self): return self.notes
     def set_notes(self, notes): self.notes = notes
-    def validate_Notes(self, value):
-        # Validate type Notes, a restriction on xs:string.
-        pass
     def get_q10Settings(self): return self.q10Settings
     def set_q10Settings(self, q10Settings): self.q10Settings = q10Settings
     def get_timeCourse(self): return self.timeCourse
@@ -9127,9 +8272,22 @@ class GateHHTauInf(Base):
     def set_instances(self, instances): self.instances = instances
     def get_type(self): return self.type_
     def set_type(self, type_): self.type_ = type_
+    def validate_Notes(self, value):
+        # Validate type Notes, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            pass
     def validate_gateTypes(self, value):
         # Validate type gateTypes, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            value = str(value)
+            enumerations = ['gateHHrates', 'gateHHratesTau', 'gateHHtauInf', 'gateHHratesInf', 'gateKS']
+            enumeration_respectee = False
+            for enum in enumerations:
+                if value == enum:
+                    enumeration_respectee = True
+                    break
+            if not enumeration_respectee:
+                warnings_.warn('Value "%(value)s" does not match xsd enumeration restriction on gateTypes' % {"value" : value.encode("utf-8")} )
     def hasContent_(self):
         if (
             self.notes is not None or
@@ -9146,20 +8304,22 @@ class GateHHTauInf(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='GateHHTauInf')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='GateHHTauInf', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='GateHHTauInf'):
         super(GateHHTauInf, self).exportAttributes(outfile, level, already_processed, namespace_, name_='GateHHTauInf')
-        if self.instances is not None and 'instances' not in already_processed:
+        if self.instances != 1 and 'instances' not in already_processed:
             already_processed.add('instances')
             outfile.write(' instances="%s"' % self.gds_format_integer(self.instances, input_name='instances'))
         if self.type_ is not None and 'type_' not in already_processed:
@@ -9173,58 +8333,20 @@ class GateHHTauInf(Base):
             eol_ = ''
         if self.notes is not None:
             showIndent(outfile, level, pretty_print)
-            outfile.write('<%snotes>%s</%snotes>%s' % (namespace_, self.gds_format_string(quote_xml(self.notes).encode(ExternalEncoding), input_name='notes'), namespace_, eol_))
+            outfile.write('<%snotes>%s</%snotes>%s' % (namespace_, self.gds_encode(self.gds_format_string(quote_xml(self.notes), input_name='notes')), namespace_, eol_))
         if self.q10Settings is not None:
             self.q10Settings.export(outfile, level, namespace_, name_='q10Settings', pretty_print=pretty_print)
         if self.timeCourse is not None:
             self.timeCourse.export(outfile, level, namespace_, name_='timeCourse', pretty_print=pretty_print)
         if self.steadyState is not None:
             self.steadyState.export(outfile, level, namespace_, name_='steadyState', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='GateHHTauInf'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.instances is not None and 'instances' not in already_processed:
-            already_processed.add('instances')
-            showIndent(outfile, level)
-            outfile.write('instances=%d,\n' % (self.instances,))
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            showIndent(outfile, level)
-            outfile.write('type_="%s",\n' % (self.type_,))
-        super(GateHHTauInf, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(GateHHTauInf, self).exportLiteralChildren(outfile, level, name_)
-        if self.notes is not None:
-            showIndent(outfile, level)
-            outfile.write('notes=%s,\n' % quote_python(self.notes).encode(ExternalEncoding))
-        if self.q10Settings is not None:
-            showIndent(outfile, level)
-            outfile.write('q10Settings=model_.Q10Settings(\n')
-            self.q10Settings.exportLiteral(outfile, level, name_='q10Settings')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.timeCourse is not None:
-            showIndent(outfile, level)
-            outfile.write('timeCourse=model_.HHTime(\n')
-            self.timeCourse.exportLiteral(outfile, level, name_='timeCourse')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.steadyState is not None:
-            showIndent(outfile, level)
-            outfile.write('steadyState=model_.HHVariable(\n')
-            self.steadyState.exportLiteral(outfile, level, name_='steadyState')
-            showIndent(outfile, level)
-            outfile.write('),\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('instances', node)
         if value is not None and 'instances' not in already_processed:
@@ -9244,19 +8366,23 @@ class GateHHTauInf(Base):
             notes_ = child_.text
             notes_ = self.gds_validate_string(notes_, node, 'notes')
             self.notes = notes_
-            self.validate_Notes(self.notes)    # validate type Notes
+            # validate type Notes
+            self.validate_Notes(self.notes)
         elif nodeName_ == 'q10Settings':
             obj_ = Q10Settings.factory()
             obj_.build(child_)
-            self.set_q10Settings(obj_)
+            self.q10Settings = obj_
+            obj_.original_tagname_ = 'q10Settings'
         elif nodeName_ == 'timeCourse':
             obj_ = HHTime.factory()
             obj_.build(child_)
-            self.set_timeCourse(obj_)
+            self.timeCourse = obj_
+            obj_.original_tagname_ = 'timeCourse'
         elif nodeName_ == 'steadyState':
             obj_ = HHVariable.factory()
             obj_.build(child_)
-            self.set_steadyState(obj_)
+            self.steadyState = obj_
+            obj_.original_tagname_ = 'steadyState'
         super(GateHHTauInf, self).buildChildren(child_, node, nodeName_, True)
 # end class GateHHTauInf
 
@@ -9265,14 +8391,21 @@ class GateHHRates(Base):
     subclass = None
     superclass = Base
     def __init__(self, id=None, neuroLexId=None, instances=1, type_=None, notes=None, q10Settings=None, forwardRate=None, reverseRate=None):
+        self.original_tagname_ = None
         super(GateHHRates, self).__init__(id, neuroLexId, )
         self.instances = _cast(int, instances)
         self.type_ = _cast(None, type_)
         self.notes = notes
+        self.validate_Notes(self.notes)
         self.q10Settings = q10Settings
         self.forwardRate = forwardRate
         self.reverseRate = reverseRate
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, GateHHRates)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if GateHHRates.subclass:
             return GateHHRates.subclass(*args_, **kwargs_)
         else:
@@ -9280,9 +8413,6 @@ class GateHHRates(Base):
     factory = staticmethod(factory)
     def get_notes(self): return self.notes
     def set_notes(self, notes): self.notes = notes
-    def validate_Notes(self, value):
-        # Validate type Notes, a restriction on xs:string.
-        pass
     def get_q10Settings(self): return self.q10Settings
     def set_q10Settings(self, q10Settings): self.q10Settings = q10Settings
     def get_forwardRate(self): return self.forwardRate
@@ -9293,9 +8423,22 @@ class GateHHRates(Base):
     def set_instances(self, instances): self.instances = instances
     def get_type(self): return self.type_
     def set_type(self, type_): self.type_ = type_
+    def validate_Notes(self, value):
+        # Validate type Notes, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            pass
     def validate_gateTypes(self, value):
         # Validate type gateTypes, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            value = str(value)
+            enumerations = ['gateHHrates', 'gateHHratesTau', 'gateHHtauInf', 'gateHHratesInf', 'gateKS']
+            enumeration_respectee = False
+            for enum in enumerations:
+                if value == enum:
+                    enumeration_respectee = True
+                    break
+            if not enumeration_respectee:
+                warnings_.warn('Value "%(value)s" does not match xsd enumeration restriction on gateTypes' % {"value" : value.encode("utf-8")} )
     def hasContent_(self):
         if (
             self.notes is not None or
@@ -9312,20 +8455,22 @@ class GateHHRates(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='GateHHRates')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='GateHHRates', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='GateHHRates'):
         super(GateHHRates, self).exportAttributes(outfile, level, already_processed, namespace_, name_='GateHHRates')
-        if self.instances is not None and 'instances' not in already_processed:
+        if self.instances != 1 and 'instances' not in already_processed:
             already_processed.add('instances')
             outfile.write(' instances="%s"' % self.gds_format_integer(self.instances, input_name='instances'))
         if self.type_ is not None and 'type_' not in already_processed:
@@ -9339,58 +8484,20 @@ class GateHHRates(Base):
             eol_ = ''
         if self.notes is not None:
             showIndent(outfile, level, pretty_print)
-            outfile.write('<%snotes>%s</%snotes>%s' % (namespace_, self.gds_format_string(quote_xml(self.notes).encode(ExternalEncoding), input_name='notes'), namespace_, eol_))
+            outfile.write('<%snotes>%s</%snotes>%s' % (namespace_, self.gds_encode(self.gds_format_string(quote_xml(self.notes), input_name='notes')), namespace_, eol_))
         if self.q10Settings is not None:
             self.q10Settings.export(outfile, level, namespace_, name_='q10Settings', pretty_print=pretty_print)
         if self.forwardRate is not None:
             self.forwardRate.export(outfile, level, namespace_, name_='forwardRate', pretty_print=pretty_print)
         if self.reverseRate is not None:
             self.reverseRate.export(outfile, level, namespace_, name_='reverseRate', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='GateHHRates'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.instances is not None and 'instances' not in already_processed:
-            already_processed.add('instances')
-            showIndent(outfile, level)
-            outfile.write('instances=%d,\n' % (self.instances,))
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            showIndent(outfile, level)
-            outfile.write('type_="%s",\n' % (self.type_,))
-        super(GateHHRates, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(GateHHRates, self).exportLiteralChildren(outfile, level, name_)
-        if self.notes is not None:
-            showIndent(outfile, level)
-            outfile.write('notes=%s,\n' % quote_python(self.notes).encode(ExternalEncoding))
-        if self.q10Settings is not None:
-            showIndent(outfile, level)
-            outfile.write('q10Settings=model_.Q10Settings(\n')
-            self.q10Settings.exportLiteral(outfile, level, name_='q10Settings')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.forwardRate is not None:
-            showIndent(outfile, level)
-            outfile.write('forwardRate=model_.HHRate(\n')
-            self.forwardRate.exportLiteral(outfile, level, name_='forwardRate')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.reverseRate is not None:
-            showIndent(outfile, level)
-            outfile.write('reverseRate=model_.HHRate(\n')
-            self.reverseRate.exportLiteral(outfile, level, name_='reverseRate')
-            showIndent(outfile, level)
-            outfile.write('),\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('instances', node)
         if value is not None and 'instances' not in already_processed:
@@ -9410,19 +8517,23 @@ class GateHHRates(Base):
             notes_ = child_.text
             notes_ = self.gds_validate_string(notes_, node, 'notes')
             self.notes = notes_
-            self.validate_Notes(self.notes)    # validate type Notes
+            # validate type Notes
+            self.validate_Notes(self.notes)
         elif nodeName_ == 'q10Settings':
             obj_ = Q10Settings.factory()
             obj_.build(child_)
-            self.set_q10Settings(obj_)
+            self.q10Settings = obj_
+            obj_.original_tagname_ = 'q10Settings'
         elif nodeName_ == 'forwardRate':
             obj_ = HHRate.factory()
             obj_.build(child_)
-            self.set_forwardRate(obj_)
+            self.forwardRate = obj_
+            obj_.original_tagname_ = 'forwardRate'
         elif nodeName_ == 'reverseRate':
             obj_ = HHRate.factory()
             obj_.build(child_)
-            self.set_reverseRate(obj_)
+            self.reverseRate = obj_
+            obj_.original_tagname_ = 'reverseRate'
         super(GateHHRates, self).buildChildren(child_, node, nodeName_, True)
 # end class GateHHRates
 
@@ -9431,16 +8542,23 @@ class GateHHUndetermined(Base):
     subclass = None
     superclass = Base
     def __init__(self, id=None, neuroLexId=None, instances=1, type_=None, notes=None, q10Settings=None, forwardRate=None, reverseRate=None, timeCourse=None, steadyState=None):
+        self.original_tagname_ = None
         super(GateHHUndetermined, self).__init__(id, neuroLexId, )
         self.instances = _cast(int, instances)
         self.type_ = _cast(None, type_)
         self.notes = notes
+        self.validate_Notes(self.notes)
         self.q10Settings = q10Settings
         self.forwardRate = forwardRate
         self.reverseRate = reverseRate
         self.timeCourse = timeCourse
         self.steadyState = steadyState
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, GateHHUndetermined)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if GateHHUndetermined.subclass:
             return GateHHUndetermined.subclass(*args_, **kwargs_)
         else:
@@ -9448,9 +8566,6 @@ class GateHHUndetermined(Base):
     factory = staticmethod(factory)
     def get_notes(self): return self.notes
     def set_notes(self, notes): self.notes = notes
-    def validate_Notes(self, value):
-        # Validate type Notes, a restriction on xs:string.
-        pass
     def get_q10Settings(self): return self.q10Settings
     def set_q10Settings(self, q10Settings): self.q10Settings = q10Settings
     def get_forwardRate(self): return self.forwardRate
@@ -9465,9 +8580,22 @@ class GateHHUndetermined(Base):
     def set_instances(self, instances): self.instances = instances
     def get_type(self): return self.type_
     def set_type(self, type_): self.type_ = type_
+    def validate_Notes(self, value):
+        # Validate type Notes, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            pass
     def validate_gateTypes(self, value):
         # Validate type gateTypes, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            value = str(value)
+            enumerations = ['gateHHrates', 'gateHHratesTau', 'gateHHtauInf', 'gateHHratesInf', 'gateKS']
+            enumeration_respectee = False
+            for enum in enumerations:
+                if value == enum:
+                    enumeration_respectee = True
+                    break
+            if not enumeration_respectee:
+                warnings_.warn('Value "%(value)s" does not match xsd enumeration restriction on gateTypes' % {"value" : value.encode("utf-8")} )
     def hasContent_(self):
         if (
             self.notes is not None or
@@ -9486,20 +8614,22 @@ class GateHHUndetermined(Base):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='GateHHUndetermined')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='GateHHUndetermined', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='GateHHUndetermined'):
         super(GateHHUndetermined, self).exportAttributes(outfile, level, already_processed, namespace_, name_='GateHHUndetermined')
-        if self.instances is not None and 'instances' not in already_processed:
+        if self.instances != 1 and 'instances' not in already_processed:
             already_processed.add('instances')
             outfile.write(' instances="%s"' % self.gds_format_integer(self.instances, input_name='instances'))
         if self.type_ is not None and 'type_' not in already_processed:
@@ -9513,7 +8643,7 @@ class GateHHUndetermined(Base):
             eol_ = ''
         if self.notes is not None:
             showIndent(outfile, level, pretty_print)
-            outfile.write('<%snotes>%s</%snotes>%s' % (namespace_, self.gds_format_string(quote_xml(self.notes).encode(ExternalEncoding), input_name='notes'), namespace_, eol_))
+            outfile.write('<%snotes>%s</%snotes>%s' % (namespace_, self.gds_encode(self.gds_format_string(quote_xml(self.notes), input_name='notes')), namespace_, eol_))
         if self.q10Settings is not None:
             self.q10Settings.export(outfile, level, namespace_, name_='q10Settings', pretty_print=pretty_print)
         if self.forwardRate is not None:
@@ -9524,66 +8654,16 @@ class GateHHUndetermined(Base):
             self.timeCourse.export(outfile, level, namespace_, name_='timeCourse', pretty_print=pretty_print)
         if self.steadyState is not None:
             self.steadyState.export(outfile, level, namespace_, name_='steadyState', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='GateHHUndetermined'):
-        level += 1
+    def build(self, node):
         already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.instances is not None and 'instances' not in already_processed:
-            already_processed.add('instances')
-            showIndent(outfile, level)
-            outfile.write('instances=%d,\n' % (self.instances,))
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            showIndent(outfile, level)
-            outfile.write('type_="%s",\n' % (self.type_,))
-        super(GateHHUndetermined, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(GateHHUndetermined, self).exportLiteralChildren(outfile, level, name_)
-        if self.notes is not None:
-            showIndent(outfile, level)
-            outfile.write('notes=%s,\n' % quote_python(self.notes).encode(ExternalEncoding))
-        if self.q10Settings is not None:
-            showIndent(outfile, level)
-            outfile.write('q10Settings=model_.Q10Settings(\n')
-            self.q10Settings.exportLiteral(outfile, level, name_='q10Settings')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.forwardRate is not None:
-            showIndent(outfile, level)
-            outfile.write('forwardRate=model_.HHRate(\n')
-            self.forwardRate.exportLiteral(outfile, level, name_='forwardRate')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.reverseRate is not None:
-            showIndent(outfile, level)
-            outfile.write('reverseRate=model_.HHRate(\n')
-            self.reverseRate.exportLiteral(outfile, level, name_='reverseRate')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.timeCourse is not None:
-            showIndent(outfile, level)
-            outfile.write('timeCourse=model_.HHTime(\n')
-            self.timeCourse.exportLiteral(outfile, level, name_='timeCourse')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.steadyState is not None:
-            showIndent(outfile, level)
-            outfile.write('steadyState=model_.HHVariable(\n')
-            self.steadyState.exportLiteral(outfile, level, name_='steadyState')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-    def build(self, node):
-        already_processed = set()
-        self.buildAttributes(node, node.attrib, already_processed)
-        for child in node:
-            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
-            self.buildChildren(child, node, nodeName_)
-    def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('instances', node)
-        if value is not None and 'instances' not in already_processed:
+        self.buildAttributes(node, node.attrib, already_processed)
+        for child in node:
+            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
+            self.buildChildren(child, node, nodeName_)
+        return self
+    def buildAttributes(self, node, attrs, already_processed):
+        value = find_attr_value_('instances', node)
+        if value is not None and 'instances' not in already_processed:
             already_processed.add('instances')
             try:
                 self.instances = int(value)
@@ -9600,27 +8680,33 @@ class GateHHUndetermined(Base):
             notes_ = child_.text
             notes_ = self.gds_validate_string(notes_, node, 'notes')
             self.notes = notes_
-            self.validate_Notes(self.notes)    # validate type Notes
+            # validate type Notes
+            self.validate_Notes(self.notes)
         elif nodeName_ == 'q10Settings':
             obj_ = Q10Settings.factory()
             obj_.build(child_)
-            self.set_q10Settings(obj_)
+            self.q10Settings = obj_
+            obj_.original_tagname_ = 'q10Settings'
         elif nodeName_ == 'forwardRate':
             obj_ = HHRate.factory()
             obj_.build(child_)
-            self.set_forwardRate(obj_)
+            self.forwardRate = obj_
+            obj_.original_tagname_ = 'forwardRate'
         elif nodeName_ == 'reverseRate':
             obj_ = HHRate.factory()
             obj_.build(child_)
-            self.set_reverseRate(obj_)
+            self.reverseRate = obj_
+            obj_.original_tagname_ = 'reverseRate'
         elif nodeName_ == 'timeCourse':
             obj_ = HHTime.factory()
             obj_.build(child_)
-            self.set_timeCourse(obj_)
+            self.timeCourse = obj_
+            obj_.original_tagname_ = 'timeCourse'
         elif nodeName_ == 'steadyState':
             obj_ = HHVariable.factory()
             obj_.build(child_)
-            self.set_steadyState(obj_)
+            self.steadyState = obj_
+            obj_.original_tagname_ = 'steadyState'
         super(GateHHUndetermined, self).buildChildren(child_, node, nodeName_, True)
 # end class GateHHUndetermined
 
@@ -9628,11 +8714,12 @@ class GateHHUndetermined(Base):
 class IonChannel(Standalone):
     subclass = None
     superclass = Standalone
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, conductance=None, type_=None, species=None, gate=None, gateHHrates=None, gateHHratesTau=None, gateHHtauInf=None, gateHHratesInf=None):
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, species=None, type_=None, conductance=None, gate=None, gateHHrates=None, gateHHratesTau=None, gateHHtauInf=None, gateHHratesInf=None):
+        self.original_tagname_ = None
         super(IonChannel, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
-        self.conductance = _cast(None, conductance)
-        self.type_ = _cast(None, type_)
         self.species = _cast(None, species)
+        self.type_ = _cast(None, type_)
+        self.conductance = _cast(None, conductance)
         if gate is None:
             self.gate = []
         else:
@@ -9654,6 +8741,11 @@ class IonChannel(Standalone):
         else:
             self.gateHHratesInf = gateHHratesInf
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, IonChannel)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if IonChannel.subclass:
             return IonChannel.subclass(*args_, **kwargs_)
         else:
@@ -9662,38 +8754,60 @@ class IonChannel(Standalone):
     def get_gate(self): return self.gate
     def set_gate(self, gate): self.gate = gate
     def add_gate(self, value): self.gate.append(value)
-    def insert_gate(self, index, value): self.gate[index] = value
+    def insert_gate_at(self, index, value): self.gate.insert(index, value)
+    def replace_gate_at(self, index, value): self.gate[index] = value
     def get_gateHHrates(self): return self.gateHHrates
     def set_gateHHrates(self, gateHHrates): self.gateHHrates = gateHHrates
     def add_gateHHrates(self, value): self.gateHHrates.append(value)
-    def insert_gateHHrates(self, index, value): self.gateHHrates[index] = value
+    def insert_gateHHrates_at(self, index, value): self.gateHHrates.insert(index, value)
+    def replace_gateHHrates_at(self, index, value): self.gateHHrates[index] = value
     def get_gateHHratesTau(self): return self.gateHHratesTau
     def set_gateHHratesTau(self, gateHHratesTau): self.gateHHratesTau = gateHHratesTau
     def add_gateHHratesTau(self, value): self.gateHHratesTau.append(value)
-    def insert_gateHHratesTau(self, index, value): self.gateHHratesTau[index] = value
+    def insert_gateHHratesTau_at(self, index, value): self.gateHHratesTau.insert(index, value)
+    def replace_gateHHratesTau_at(self, index, value): self.gateHHratesTau[index] = value
     def get_gateHHtauInf(self): return self.gateHHtauInf
     def set_gateHHtauInf(self, gateHHtauInf): self.gateHHtauInf = gateHHtauInf
     def add_gateHHtauInf(self, value): self.gateHHtauInf.append(value)
-    def insert_gateHHtauInf(self, index, value): self.gateHHtauInf[index] = value
+    def insert_gateHHtauInf_at(self, index, value): self.gateHHtauInf.insert(index, value)
+    def replace_gateHHtauInf_at(self, index, value): self.gateHHtauInf[index] = value
     def get_gateHHratesInf(self): return self.gateHHratesInf
     def set_gateHHratesInf(self, gateHHratesInf): self.gateHHratesInf = gateHHratesInf
     def add_gateHHratesInf(self, value): self.gateHHratesInf.append(value)
-    def insert_gateHHratesInf(self, index, value): self.gateHHratesInf[index] = value
-    def get_conductance(self): return self.conductance
-    def set_conductance(self, conductance): self.conductance = conductance
-    def validate_Nml2Quantity_conductance(self, value):
-        # Validate type Nml2Quantity_conductance, a restriction on xs:string.
-        pass
-    def get_type(self): return self.type_
-    def set_type(self, type_): self.type_ = type_
-    def validate_channelTypes(self, value):
-        # Validate type channelTypes, a restriction on xs:string.
-        pass
+    def insert_gateHHratesInf_at(self, index, value): self.gateHHratesInf.insert(index, value)
+    def replace_gateHHratesInf_at(self, index, value): self.gateHHratesInf[index] = value
     def get_species(self): return self.species
     def set_species(self, species): self.species = species
+    def get_type(self): return self.type_
+    def set_type(self, type_): self.type_ = type_
+    def get_conductance(self): return self.conductance
+    def set_conductance(self, conductance): self.conductance = conductance
     def validate_NmlId(self, value):
         # Validate type NmlId, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_NmlId_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_NmlId_patterns_, ))
+    validate_NmlId_patterns_ = [['^[a-zA-Z0-9_]*$']]
+    def validate_channelTypes(self, value):
+        # Validate type channelTypes, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            value = str(value)
+            enumerations = ['ionChannelPassive', 'ionChannelHH', 'ionChannelKS']
+            enumeration_respectee = False
+            for enum in enumerations:
+                if value == enum:
+                    enumeration_respectee = True
+                    break
+            if not enumeration_respectee:
+                warnings_.warn('Value "%(value)s" does not match xsd enumeration restriction on channelTypes' % {"value" : value.encode("utf-8")} )
+    def validate_Nml2Quantity_conductance(self, value):
+        # Validate type Nml2Quantity_conductance, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_conductance_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_conductance_patterns_, ))
+    validate_Nml2Quantity_conductance_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(S|mS|uS|nS|pS)$']]
     def hasContent_(self):
         if (
             self.gate or
@@ -9711,28 +8825,30 @@ class IonChannel(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='IonChannel')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='IonChannel', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='IonChannel'):
         super(IonChannel, self).exportAttributes(outfile, level, already_processed, namespace_, name_='IonChannel')
-        if self.conductance is not None and 'conductance' not in already_processed:
-            already_processed.add('conductance')
-            outfile.write(' conductance=%s' % (quote_attrib(self.conductance), ))
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            outfile.write(' type=%s' % (quote_attrib(self.type_), ))
         if self.species is not None and 'species' not in already_processed:
             already_processed.add('species')
             outfile.write(' species=%s' % (quote_attrib(self.species), ))
+        if self.type_ is not None and 'type_' not in already_processed:
+            already_processed.add('type_')
+            outfile.write(' type=%s' % (quote_attrib(self.type_), ))
+        if self.conductance is not None and 'conductance' not in already_processed:
+            already_processed.add('conductance')
+            outfile.write(' conductance=%s' % (quote_attrib(self.conductance), ))
     def exportChildren(self, outfile, level, namespace_='', name_='IonChannel', fromsubclass_=False, pretty_print=True):
         super(IonChannel, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
         if pretty_print:
@@ -9749,132 +8865,56 @@ class IonChannel(Standalone):
             gateHHtauInf_.export(outfile, level, namespace_, name_='gateHHtauInf', pretty_print=pretty_print)
         for gateHHratesInf_ in self.gateHHratesInf:
             gateHHratesInf_.export(outfile, level, namespace_, name_='gateHHratesInf', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='IonChannel'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.conductance is not None and 'conductance' not in already_processed:
-            already_processed.add('conductance')
-            showIndent(outfile, level)
-            outfile.write('conductance="%s",\n' % (self.conductance,))
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            showIndent(outfile, level)
-            outfile.write('type_="%s",\n' % (self.type_,))
-        if self.species is not None and 'species' not in already_processed:
-            already_processed.add('species')
-            showIndent(outfile, level)
-            outfile.write('species="%s",\n' % (self.species,))
-        super(IonChannel, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(IonChannel, self).exportLiteralChildren(outfile, level, name_)
-        showIndent(outfile, level)
-        outfile.write('gate=[\n')
-        level += 1
-        for gate_ in self.gate:
-            showIndent(outfile, level)
-            outfile.write('model_.GateHHUndetermined(\n')
-            gate_.exportLiteral(outfile, level, name_='GateHHUndetermined')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('gateHHrates=[\n')
-        level += 1
-        for gateHHrates_ in self.gateHHrates:
-            showIndent(outfile, level)
-            outfile.write('model_.GateHHRates(\n')
-            gateHHrates_.exportLiteral(outfile, level, name_='GateHHRates')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('gateHHratesTau=[\n')
-        level += 1
-        for gateHHratesTau_ in self.gateHHratesTau:
-            showIndent(outfile, level)
-            outfile.write('model_.GateHHRatesTau(\n')
-            gateHHratesTau_.exportLiteral(outfile, level, name_='GateHHRatesTau')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('gateHHtauInf=[\n')
-        level += 1
-        for gateHHtauInf_ in self.gateHHtauInf:
-            showIndent(outfile, level)
-            outfile.write('model_.GateHHTauInf(\n')
-            gateHHtauInf_.exportLiteral(outfile, level, name_='GateHHTauInf')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('gateHHratesInf=[\n')
-        level += 1
-        for gateHHratesInf_ in self.gateHHratesInf:
-            showIndent(outfile, level)
-            outfile.write('model_.GateHHRatesInf(\n')
-            gateHHratesInf_.exportLiteral(outfile, level, name_='GateHHRatesInf')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('conductance', node)
-        if value is not None and 'conductance' not in already_processed:
-            already_processed.add('conductance')
-            self.conductance = value
-            self.validate_Nml2Quantity_conductance(self.conductance)    # validate type Nml2Quantity_conductance
-        value = find_attr_value_('type', node)
-        if value is not None and 'type' not in already_processed:
-            already_processed.add('type')
-            self.type_ = value
-            self.validate_channelTypes(self.type_)    # validate type channelTypes
         value = find_attr_value_('species', node)
         if value is not None and 'species' not in already_processed:
             already_processed.add('species')
             self.species = value
             self.validate_NmlId(self.species)    # validate type NmlId
+        value = find_attr_value_('type', node)
+        if value is not None and 'type' not in already_processed:
+            already_processed.add('type')
+            self.type_ = value
+            self.validate_channelTypes(self.type_)    # validate type channelTypes
+        value = find_attr_value_('conductance', node)
+        if value is not None and 'conductance' not in already_processed:
+            already_processed.add('conductance')
+            self.conductance = value
+            self.validate_Nml2Quantity_conductance(self.conductance)    # validate type Nml2Quantity_conductance
         super(IonChannel, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         if nodeName_ == 'gate':
             obj_ = GateHHUndetermined.factory()
             obj_.build(child_)
             self.gate.append(obj_)
+            obj_.original_tagname_ = 'gate'
         elif nodeName_ == 'gateHHrates':
             obj_ = GateHHRates.factory()
             obj_.build(child_)
             self.gateHHrates.append(obj_)
+            obj_.original_tagname_ = 'gateHHrates'
         elif nodeName_ == 'gateHHratesTau':
             obj_ = GateHHRatesTau.factory()
             obj_.build(child_)
             self.gateHHratesTau.append(obj_)
+            obj_.original_tagname_ = 'gateHHratesTau'
         elif nodeName_ == 'gateHHtauInf':
             obj_ = GateHHTauInf.factory()
             obj_.build(child_)
             self.gateHHtauInf.append(obj_)
+            obj_.original_tagname_ = 'gateHHtauInf'
         elif nodeName_ == 'gateHHratesInf':
             obj_ = GateHHRatesInf.factory()
             obj_.build(child_)
             self.gateHHratesInf.append(obj_)
+            obj_.original_tagname_ = 'gateHHratesInf'
         super(IonChannel, self).buildChildren(child_, node, nodeName_, True)
 # end class IonChannel
 
@@ -9883,6 +8923,7 @@ class NeuroMLDocument(Standalone):
     subclass = None
     superclass = Standalone
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, include=None, extracellularProperties=None, intracellularProperties=None, morphology=None, ionChannel=None, decayingPoolConcentrationModel=None, expOneSynapse=None, expTwoSynapse=None, blockingPlasticSynapse=None, biophysicalProperties=None, cell=None, baseCell=None, iafTauCell=None, iafTauRefCell=None, iafCell=None, iafRefCell=None, izhikevichCell=None, adExIaFCell=None, pulseGenerator=None, sineGenerator=None, rampGenerator=None, voltageClamp=None, spikeArray=None, spikeGenerator=None, spikeGeneratorRandom=None, spikeGeneratorPoisson=None, IF_curr_alpha=None, IF_curr_exp=None, IF_cond_alpha=None, IF_cond_exp=None, EIF_cond_exp_isfa_ista=None, EIF_cond_alpha_isfa_ista=None, HH_cond_exp=None, expCondSynapse=None, alphaCondSynapse=None, expCurrSynapse=None, alphaCurrSynapse=None, SpikeSourcePoisson=None, network=None, ComponentType=None):
+        self.original_tagname_ = None
         super(NeuroMLDocument, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
         if include is None:
             self.include = []
@@ -10045,6 +9086,11 @@ class NeuroMLDocument(Standalone):
         else:
             self.ComponentType = ComponentType
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, NeuroMLDocument)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if NeuroMLDocument.subclass:
             return NeuroMLDocument.subclass(*args_, **kwargs_)
         else:
@@ -10053,163 +9099,203 @@ class NeuroMLDocument(Standalone):
     def get_include(self): return self.include
     def set_include(self, include): self.include = include
     def add_include(self, value): self.include.append(value)
-    def insert_include(self, index, value): self.include[index] = value
+    def insert_include_at(self, index, value): self.include.insert(index, value)
+    def replace_include_at(self, index, value): self.include[index] = value
     def get_extracellularProperties(self): return self.extracellularProperties
     def set_extracellularProperties(self, extracellularProperties): self.extracellularProperties = extracellularProperties
     def add_extracellularProperties(self, value): self.extracellularProperties.append(value)
-    def insert_extracellularProperties(self, index, value): self.extracellularProperties[index] = value
+    def insert_extracellularProperties_at(self, index, value): self.extracellularProperties.insert(index, value)
+    def replace_extracellularProperties_at(self, index, value): self.extracellularProperties[index] = value
     def get_intracellularProperties(self): return self.intracellularProperties
     def set_intracellularProperties(self, intracellularProperties): self.intracellularProperties = intracellularProperties
     def add_intracellularProperties(self, value): self.intracellularProperties.append(value)
-    def insert_intracellularProperties(self, index, value): self.intracellularProperties[index] = value
+    def insert_intracellularProperties_at(self, index, value): self.intracellularProperties.insert(index, value)
+    def replace_intracellularProperties_at(self, index, value): self.intracellularProperties[index] = value
     def get_morphology(self): return self.morphology
     def set_morphology(self, morphology): self.morphology = morphology
     def add_morphology(self, value): self.morphology.append(value)
-    def insert_morphology(self, index, value): self.morphology[index] = value
+    def insert_morphology_at(self, index, value): self.morphology.insert(index, value)
+    def replace_morphology_at(self, index, value): self.morphology[index] = value
     def get_ionChannel(self): return self.ionChannel
     def set_ionChannel(self, ionChannel): self.ionChannel = ionChannel
     def add_ionChannel(self, value): self.ionChannel.append(value)
-    def insert_ionChannel(self, index, value): self.ionChannel[index] = value
+    def insert_ionChannel_at(self, index, value): self.ionChannel.insert(index, value)
+    def replace_ionChannel_at(self, index, value): self.ionChannel[index] = value
     def get_decayingPoolConcentrationModel(self): return self.decayingPoolConcentrationModel
     def set_decayingPoolConcentrationModel(self, decayingPoolConcentrationModel): self.decayingPoolConcentrationModel = decayingPoolConcentrationModel
     def add_decayingPoolConcentrationModel(self, value): self.decayingPoolConcentrationModel.append(value)
-    def insert_decayingPoolConcentrationModel(self, index, value): self.decayingPoolConcentrationModel[index] = value
+    def insert_decayingPoolConcentrationModel_at(self, index, value): self.decayingPoolConcentrationModel.insert(index, value)
+    def replace_decayingPoolConcentrationModel_at(self, index, value): self.decayingPoolConcentrationModel[index] = value
     def get_expOneSynapse(self): return self.expOneSynapse
     def set_expOneSynapse(self, expOneSynapse): self.expOneSynapse = expOneSynapse
     def add_expOneSynapse(self, value): self.expOneSynapse.append(value)
-    def insert_expOneSynapse(self, index, value): self.expOneSynapse[index] = value
+    def insert_expOneSynapse_at(self, index, value): self.expOneSynapse.insert(index, value)
+    def replace_expOneSynapse_at(self, index, value): self.expOneSynapse[index] = value
     def get_expTwoSynapse(self): return self.expTwoSynapse
     def set_expTwoSynapse(self, expTwoSynapse): self.expTwoSynapse = expTwoSynapse
     def add_expTwoSynapse(self, value): self.expTwoSynapse.append(value)
-    def insert_expTwoSynapse(self, index, value): self.expTwoSynapse[index] = value
+    def insert_expTwoSynapse_at(self, index, value): self.expTwoSynapse.insert(index, value)
+    def replace_expTwoSynapse_at(self, index, value): self.expTwoSynapse[index] = value
     def get_blockingPlasticSynapse(self): return self.blockingPlasticSynapse
     def set_blockingPlasticSynapse(self, blockingPlasticSynapse): self.blockingPlasticSynapse = blockingPlasticSynapse
     def add_blockingPlasticSynapse(self, value): self.blockingPlasticSynapse.append(value)
-    def insert_blockingPlasticSynapse(self, index, value): self.blockingPlasticSynapse[index] = value
+    def insert_blockingPlasticSynapse_at(self, index, value): self.blockingPlasticSynapse.insert(index, value)
+    def replace_blockingPlasticSynapse_at(self, index, value): self.blockingPlasticSynapse[index] = value
     def get_biophysicalProperties(self): return self.biophysicalProperties
     def set_biophysicalProperties(self, biophysicalProperties): self.biophysicalProperties = biophysicalProperties
     def add_biophysicalProperties(self, value): self.biophysicalProperties.append(value)
-    def insert_biophysicalProperties(self, index, value): self.biophysicalProperties[index] = value
+    def insert_biophysicalProperties_at(self, index, value): self.biophysicalProperties.insert(index, value)
+    def replace_biophysicalProperties_at(self, index, value): self.biophysicalProperties[index] = value
     def get_cell(self): return self.cell
     def set_cell(self, cell): self.cell = cell
     def add_cell(self, value): self.cell.append(value)
-    def insert_cell(self, index, value): self.cell[index] = value
+    def insert_cell_at(self, index, value): self.cell.insert(index, value)
+    def replace_cell_at(self, index, value): self.cell[index] = value
     def get_baseCell(self): return self.baseCell
     def set_baseCell(self, baseCell): self.baseCell = baseCell
     def add_baseCell(self, value): self.baseCell.append(value)
-    def insert_baseCell(self, index, value): self.baseCell[index] = value
+    def insert_baseCell_at(self, index, value): self.baseCell.insert(index, value)
+    def replace_baseCell_at(self, index, value): self.baseCell[index] = value
     def get_iafTauCell(self): return self.iafTauCell
     def set_iafTauCell(self, iafTauCell): self.iafTauCell = iafTauCell
     def add_iafTauCell(self, value): self.iafTauCell.append(value)
-    def insert_iafTauCell(self, index, value): self.iafTauCell[index] = value
+    def insert_iafTauCell_at(self, index, value): self.iafTauCell.insert(index, value)
+    def replace_iafTauCell_at(self, index, value): self.iafTauCell[index] = value
     def get_iafTauRefCell(self): return self.iafTauRefCell
     def set_iafTauRefCell(self, iafTauRefCell): self.iafTauRefCell = iafTauRefCell
     def add_iafTauRefCell(self, value): self.iafTauRefCell.append(value)
-    def insert_iafTauRefCell(self, index, value): self.iafTauRefCell[index] = value
+    def insert_iafTauRefCell_at(self, index, value): self.iafTauRefCell.insert(index, value)
+    def replace_iafTauRefCell_at(self, index, value): self.iafTauRefCell[index] = value
     def get_iafCell(self): return self.iafCell
     def set_iafCell(self, iafCell): self.iafCell = iafCell
     def add_iafCell(self, value): self.iafCell.append(value)
-    def insert_iafCell(self, index, value): self.iafCell[index] = value
+    def insert_iafCell_at(self, index, value): self.iafCell.insert(index, value)
+    def replace_iafCell_at(self, index, value): self.iafCell[index] = value
     def get_iafRefCell(self): return self.iafRefCell
     def set_iafRefCell(self, iafRefCell): self.iafRefCell = iafRefCell
     def add_iafRefCell(self, value): self.iafRefCell.append(value)
-    def insert_iafRefCell(self, index, value): self.iafRefCell[index] = value
+    def insert_iafRefCell_at(self, index, value): self.iafRefCell.insert(index, value)
+    def replace_iafRefCell_at(self, index, value): self.iafRefCell[index] = value
     def get_izhikevichCell(self): return self.izhikevichCell
     def set_izhikevichCell(self, izhikevichCell): self.izhikevichCell = izhikevichCell
     def add_izhikevichCell(self, value): self.izhikevichCell.append(value)
-    def insert_izhikevichCell(self, index, value): self.izhikevichCell[index] = value
+    def insert_izhikevichCell_at(self, index, value): self.izhikevichCell.insert(index, value)
+    def replace_izhikevichCell_at(self, index, value): self.izhikevichCell[index] = value
     def get_adExIaFCell(self): return self.adExIaFCell
     def set_adExIaFCell(self, adExIaFCell): self.adExIaFCell = adExIaFCell
     def add_adExIaFCell(self, value): self.adExIaFCell.append(value)
-    def insert_adExIaFCell(self, index, value): self.adExIaFCell[index] = value
+    def insert_adExIaFCell_at(self, index, value): self.adExIaFCell.insert(index, value)
+    def replace_adExIaFCell_at(self, index, value): self.adExIaFCell[index] = value
     def get_pulseGenerator(self): return self.pulseGenerator
     def set_pulseGenerator(self, pulseGenerator): self.pulseGenerator = pulseGenerator
     def add_pulseGenerator(self, value): self.pulseGenerator.append(value)
-    def insert_pulseGenerator(self, index, value): self.pulseGenerator[index] = value
+    def insert_pulseGenerator_at(self, index, value): self.pulseGenerator.insert(index, value)
+    def replace_pulseGenerator_at(self, index, value): self.pulseGenerator[index] = value
     def get_sineGenerator(self): return self.sineGenerator
     def set_sineGenerator(self, sineGenerator): self.sineGenerator = sineGenerator
     def add_sineGenerator(self, value): self.sineGenerator.append(value)
-    def insert_sineGenerator(self, index, value): self.sineGenerator[index] = value
+    def insert_sineGenerator_at(self, index, value): self.sineGenerator.insert(index, value)
+    def replace_sineGenerator_at(self, index, value): self.sineGenerator[index] = value
     def get_rampGenerator(self): return self.rampGenerator
     def set_rampGenerator(self, rampGenerator): self.rampGenerator = rampGenerator
     def add_rampGenerator(self, value): self.rampGenerator.append(value)
-    def insert_rampGenerator(self, index, value): self.rampGenerator[index] = value
+    def insert_rampGenerator_at(self, index, value): self.rampGenerator.insert(index, value)
+    def replace_rampGenerator_at(self, index, value): self.rampGenerator[index] = value
     def get_voltageClamp(self): return self.voltageClamp
     def set_voltageClamp(self, voltageClamp): self.voltageClamp = voltageClamp
     def add_voltageClamp(self, value): self.voltageClamp.append(value)
-    def insert_voltageClamp(self, index, value): self.voltageClamp[index] = value
+    def insert_voltageClamp_at(self, index, value): self.voltageClamp.insert(index, value)
+    def replace_voltageClamp_at(self, index, value): self.voltageClamp[index] = value
     def get_spikeArray(self): return self.spikeArray
     def set_spikeArray(self, spikeArray): self.spikeArray = spikeArray
     def add_spikeArray(self, value): self.spikeArray.append(value)
-    def insert_spikeArray(self, index, value): self.spikeArray[index] = value
+    def insert_spikeArray_at(self, index, value): self.spikeArray.insert(index, value)
+    def replace_spikeArray_at(self, index, value): self.spikeArray[index] = value
     def get_spikeGenerator(self): return self.spikeGenerator
     def set_spikeGenerator(self, spikeGenerator): self.spikeGenerator = spikeGenerator
     def add_spikeGenerator(self, value): self.spikeGenerator.append(value)
-    def insert_spikeGenerator(self, index, value): self.spikeGenerator[index] = value
+    def insert_spikeGenerator_at(self, index, value): self.spikeGenerator.insert(index, value)
+    def replace_spikeGenerator_at(self, index, value): self.spikeGenerator[index] = value
     def get_spikeGeneratorRandom(self): return self.spikeGeneratorRandom
     def set_spikeGeneratorRandom(self, spikeGeneratorRandom): self.spikeGeneratorRandom = spikeGeneratorRandom
     def add_spikeGeneratorRandom(self, value): self.spikeGeneratorRandom.append(value)
-    def insert_spikeGeneratorRandom(self, index, value): self.spikeGeneratorRandom[index] = value
+    def insert_spikeGeneratorRandom_at(self, index, value): self.spikeGeneratorRandom.insert(index, value)
+    def replace_spikeGeneratorRandom_at(self, index, value): self.spikeGeneratorRandom[index] = value
     def get_spikeGeneratorPoisson(self): return self.spikeGeneratorPoisson
     def set_spikeGeneratorPoisson(self, spikeGeneratorPoisson): self.spikeGeneratorPoisson = spikeGeneratorPoisson
     def add_spikeGeneratorPoisson(self, value): self.spikeGeneratorPoisson.append(value)
-    def insert_spikeGeneratorPoisson(self, index, value): self.spikeGeneratorPoisson[index] = value
+    def insert_spikeGeneratorPoisson_at(self, index, value): self.spikeGeneratorPoisson.insert(index, value)
+    def replace_spikeGeneratorPoisson_at(self, index, value): self.spikeGeneratorPoisson[index] = value
     def get_IF_curr_alpha(self): return self.IF_curr_alpha
     def set_IF_curr_alpha(self, IF_curr_alpha): self.IF_curr_alpha = IF_curr_alpha
     def add_IF_curr_alpha(self, value): self.IF_curr_alpha.append(value)
-    def insert_IF_curr_alpha(self, index, value): self.IF_curr_alpha[index] = value
+    def insert_IF_curr_alpha_at(self, index, value): self.IF_curr_alpha.insert(index, value)
+    def replace_IF_curr_alpha_at(self, index, value): self.IF_curr_alpha[index] = value
     def get_IF_curr_exp(self): return self.IF_curr_exp
     def set_IF_curr_exp(self, IF_curr_exp): self.IF_curr_exp = IF_curr_exp
     def add_IF_curr_exp(self, value): self.IF_curr_exp.append(value)
-    def insert_IF_curr_exp(self, index, value): self.IF_curr_exp[index] = value
+    def insert_IF_curr_exp_at(self, index, value): self.IF_curr_exp.insert(index, value)
+    def replace_IF_curr_exp_at(self, index, value): self.IF_curr_exp[index] = value
     def get_IF_cond_alpha(self): return self.IF_cond_alpha
     def set_IF_cond_alpha(self, IF_cond_alpha): self.IF_cond_alpha = IF_cond_alpha
     def add_IF_cond_alpha(self, value): self.IF_cond_alpha.append(value)
-    def insert_IF_cond_alpha(self, index, value): self.IF_cond_alpha[index] = value
+    def insert_IF_cond_alpha_at(self, index, value): self.IF_cond_alpha.insert(index, value)
+    def replace_IF_cond_alpha_at(self, index, value): self.IF_cond_alpha[index] = value
     def get_IF_cond_exp(self): return self.IF_cond_exp
     def set_IF_cond_exp(self, IF_cond_exp): self.IF_cond_exp = IF_cond_exp
     def add_IF_cond_exp(self, value): self.IF_cond_exp.append(value)
-    def insert_IF_cond_exp(self, index, value): self.IF_cond_exp[index] = value
+    def insert_IF_cond_exp_at(self, index, value): self.IF_cond_exp.insert(index, value)
+    def replace_IF_cond_exp_at(self, index, value): self.IF_cond_exp[index] = value
     def get_EIF_cond_exp_isfa_ista(self): return self.EIF_cond_exp_isfa_ista
     def set_EIF_cond_exp_isfa_ista(self, EIF_cond_exp_isfa_ista): self.EIF_cond_exp_isfa_ista = EIF_cond_exp_isfa_ista
     def add_EIF_cond_exp_isfa_ista(self, value): self.EIF_cond_exp_isfa_ista.append(value)
-    def insert_EIF_cond_exp_isfa_ista(self, index, value): self.EIF_cond_exp_isfa_ista[index] = value
+    def insert_EIF_cond_exp_isfa_ista_at(self, index, value): self.EIF_cond_exp_isfa_ista.insert(index, value)
+    def replace_EIF_cond_exp_isfa_ista_at(self, index, value): self.EIF_cond_exp_isfa_ista[index] = value
     def get_EIF_cond_alpha_isfa_ista(self): return self.EIF_cond_alpha_isfa_ista
     def set_EIF_cond_alpha_isfa_ista(self, EIF_cond_alpha_isfa_ista): self.EIF_cond_alpha_isfa_ista = EIF_cond_alpha_isfa_ista
     def add_EIF_cond_alpha_isfa_ista(self, value): self.EIF_cond_alpha_isfa_ista.append(value)
-    def insert_EIF_cond_alpha_isfa_ista(self, index, value): self.EIF_cond_alpha_isfa_ista[index] = value
+    def insert_EIF_cond_alpha_isfa_ista_at(self, index, value): self.EIF_cond_alpha_isfa_ista.insert(index, value)
+    def replace_EIF_cond_alpha_isfa_ista_at(self, index, value): self.EIF_cond_alpha_isfa_ista[index] = value
     def get_HH_cond_exp(self): return self.HH_cond_exp
     def set_HH_cond_exp(self, HH_cond_exp): self.HH_cond_exp = HH_cond_exp
     def add_HH_cond_exp(self, value): self.HH_cond_exp.append(value)
-    def insert_HH_cond_exp(self, index, value): self.HH_cond_exp[index] = value
+    def insert_HH_cond_exp_at(self, index, value): self.HH_cond_exp.insert(index, value)
+    def replace_HH_cond_exp_at(self, index, value): self.HH_cond_exp[index] = value
     def get_expCondSynapse(self): return self.expCondSynapse
     def set_expCondSynapse(self, expCondSynapse): self.expCondSynapse = expCondSynapse
     def add_expCondSynapse(self, value): self.expCondSynapse.append(value)
-    def insert_expCondSynapse(self, index, value): self.expCondSynapse[index] = value
+    def insert_expCondSynapse_at(self, index, value): self.expCondSynapse.insert(index, value)
+    def replace_expCondSynapse_at(self, index, value): self.expCondSynapse[index] = value
     def get_alphaCondSynapse(self): return self.alphaCondSynapse
     def set_alphaCondSynapse(self, alphaCondSynapse): self.alphaCondSynapse = alphaCondSynapse
     def add_alphaCondSynapse(self, value): self.alphaCondSynapse.append(value)
-    def insert_alphaCondSynapse(self, index, value): self.alphaCondSynapse[index] = value
+    def insert_alphaCondSynapse_at(self, index, value): self.alphaCondSynapse.insert(index, value)
+    def replace_alphaCondSynapse_at(self, index, value): self.alphaCondSynapse[index] = value
     def get_expCurrSynapse(self): return self.expCurrSynapse
     def set_expCurrSynapse(self, expCurrSynapse): self.expCurrSynapse = expCurrSynapse
     def add_expCurrSynapse(self, value): self.expCurrSynapse.append(value)
-    def insert_expCurrSynapse(self, index, value): self.expCurrSynapse[index] = value
+    def insert_expCurrSynapse_at(self, index, value): self.expCurrSynapse.insert(index, value)
+    def replace_expCurrSynapse_at(self, index, value): self.expCurrSynapse[index] = value
     def get_alphaCurrSynapse(self): return self.alphaCurrSynapse
     def set_alphaCurrSynapse(self, alphaCurrSynapse): self.alphaCurrSynapse = alphaCurrSynapse
     def add_alphaCurrSynapse(self, value): self.alphaCurrSynapse.append(value)
-    def insert_alphaCurrSynapse(self, index, value): self.alphaCurrSynapse[index] = value
+    def insert_alphaCurrSynapse_at(self, index, value): self.alphaCurrSynapse.insert(index, value)
+    def replace_alphaCurrSynapse_at(self, index, value): self.alphaCurrSynapse[index] = value
     def get_SpikeSourcePoisson(self): return self.SpikeSourcePoisson
     def set_SpikeSourcePoisson(self, SpikeSourcePoisson): self.SpikeSourcePoisson = SpikeSourcePoisson
     def add_SpikeSourcePoisson(self, value): self.SpikeSourcePoisson.append(value)
-    def insert_SpikeSourcePoisson(self, index, value): self.SpikeSourcePoisson[index] = value
+    def insert_SpikeSourcePoisson_at(self, index, value): self.SpikeSourcePoisson.insert(index, value)
+    def replace_SpikeSourcePoisson_at(self, index, value): self.SpikeSourcePoisson[index] = value
     def get_network(self): return self.network
     def set_network(self, network): self.network = network
     def add_network(self, value): self.network.append(value)
-    def insert_network(self, index, value): self.network[index] = value
+    def insert_network_at(self, index, value): self.network.insert(index, value)
+    def replace_network_at(self, index, value): self.network[index] = value
     def get_ComponentType(self): return self.ComponentType
     def set_ComponentType(self, ComponentType): self.ComponentType = ComponentType
     def add_ComponentType(self, value): self.ComponentType.append(value)
-    def insert_ComponentType(self, index, value): self.ComponentType[index] = value
+    def insert_ComponentType_at(self, index, value): self.ComponentType.insert(index, value)
+    def replace_ComponentType_at(self, index, value): self.ComponentType[index] = value
     def hasContent_(self):
         if (
             self.include or
@@ -10262,13 +9348,15 @@ class NeuroMLDocument(Standalone):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='NeuroMLDocument')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='NeuroMLDocument', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -10361,502 +9449,13 @@ class NeuroMLDocument(Standalone):
             network_.export(outfile, level, namespace_, name_='network', pretty_print=pretty_print)
         for ComponentType_ in self.ComponentType:
             ComponentType_.export(outfile, level, namespace_, name_='ComponentType', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='NeuroMLDocument'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        super(NeuroMLDocument, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(NeuroMLDocument, self).exportLiteralChildren(outfile, level, name_)
-        showIndent(outfile, level)
-        outfile.write('include=[\n')
-        level += 1
-        for include_ in self.include:
-            showIndent(outfile, level)
-            outfile.write('model_.IncludeType(\n')
-            include_.exportLiteral(outfile, level, name_='IncludeType')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('extracellularProperties=[\n')
-        level += 1
-        for extracellularProperties_ in self.extracellularProperties:
-            showIndent(outfile, level)
-            outfile.write('model_.ExtracellularProperties(\n')
-            extracellularProperties_.exportLiteral(outfile, level, name_='ExtracellularProperties')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('intracellularProperties=[\n')
-        level += 1
-        for intracellularProperties_ in self.intracellularProperties:
-            showIndent(outfile, level)
-            outfile.write('model_.IntracellularProperties(\n')
-            intracellularProperties_.exportLiteral(outfile, level, name_='IntracellularProperties')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('morphology=[\n')
-        level += 1
-        for morphology_ in self.morphology:
-            showIndent(outfile, level)
-            outfile.write('model_.Morphology(\n')
-            morphology_.exportLiteral(outfile, level, name_='Morphology')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('ionChannel=[\n')
-        level += 1
-        for ionChannel_ in self.ionChannel:
-            showIndent(outfile, level)
-            outfile.write('model_.IonChannel(\n')
-            ionChannel_.exportLiteral(outfile, level, name_='IonChannel')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('decayingPoolConcentrationModel=[\n')
-        level += 1
-        for decayingPoolConcentrationModel_ in self.decayingPoolConcentrationModel:
-            showIndent(outfile, level)
-            outfile.write('model_.DecayingPoolConcentrationModel(\n')
-            decayingPoolConcentrationModel_.exportLiteral(outfile, level, name_='DecayingPoolConcentrationModel')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('expOneSynapse=[\n')
-        level += 1
-        for expOneSynapse_ in self.expOneSynapse:
-            showIndent(outfile, level)
-            outfile.write('model_.ExpOneSynapse(\n')
-            expOneSynapse_.exportLiteral(outfile, level, name_='ExpOneSynapse')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('expTwoSynapse=[\n')
-        level += 1
-        for expTwoSynapse_ in self.expTwoSynapse:
-            showIndent(outfile, level)
-            outfile.write('model_.ExpTwoSynapse(\n')
-            expTwoSynapse_.exportLiteral(outfile, level, name_='ExpTwoSynapse')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('blockingPlasticSynapse=[\n')
-        level += 1
-        for blockingPlasticSynapse_ in self.blockingPlasticSynapse:
-            showIndent(outfile, level)
-            outfile.write('model_.BlockingPlasticSynapse(\n')
-            blockingPlasticSynapse_.exportLiteral(outfile, level, name_='BlockingPlasticSynapse')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('biophysicalProperties=[\n')
-        level += 1
-        for biophysicalProperties_ in self.biophysicalProperties:
-            showIndent(outfile, level)
-            outfile.write('model_.BiophysicalProperties(\n')
-            biophysicalProperties_.exportLiteral(outfile, level, name_='BiophysicalProperties')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('cell=[\n')
-        level += 1
-        for cell_ in self.cell:
-            showIndent(outfile, level)
-            outfile.write('model_.Cell(\n')
-            cell_.exportLiteral(outfile, level, name_='Cell')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('baseCell=[\n')
-        level += 1
-        for baseCell_ in self.baseCell:
-            showIndent(outfile, level)
-            outfile.write('model_.BaseCell(\n')
-            baseCell_.exportLiteral(outfile, level, name_='BaseCell')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('iafTauCell=[\n')
-        level += 1
-        for iafTauCell_ in self.iafTauCell:
-            showIndent(outfile, level)
-            outfile.write('model_.IaFTauCell(\n')
-            iafTauCell_.exportLiteral(outfile, level, name_='IaFTauCell')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('iafTauRefCell=[\n')
-        level += 1
-        for iafTauRefCell_ in self.iafTauRefCell:
-            showIndent(outfile, level)
-            outfile.write('model_.IaFTauRefCell(\n')
-            iafTauRefCell_.exportLiteral(outfile, level, name_='IaFTauRefCell')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('iafCell=[\n')
-        level += 1
-        for iafCell_ in self.iafCell:
-            showIndent(outfile, level)
-            outfile.write('model_.IaFCell(\n')
-            iafCell_.exportLiteral(outfile, level, name_='IaFCell')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('iafRefCell=[\n')
-        level += 1
-        for iafRefCell_ in self.iafRefCell:
-            showIndent(outfile, level)
-            outfile.write('model_.IaFRefCell(\n')
-            iafRefCell_.exportLiteral(outfile, level, name_='IaFRefCell')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('izhikevichCell=[\n')
-        level += 1
-        for izhikevichCell_ in self.izhikevichCell:
-            showIndent(outfile, level)
-            outfile.write('model_.IzhikevichCell(\n')
-            izhikevichCell_.exportLiteral(outfile, level, name_='IzhikevichCell')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('adExIaFCell=[\n')
-        level += 1
-        for adExIaFCell_ in self.adExIaFCell:
-            showIndent(outfile, level)
-            outfile.write('model_.AdExIaFCell(\n')
-            adExIaFCell_.exportLiteral(outfile, level, name_='AdExIaFCell')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('pulseGenerator=[\n')
-        level += 1
-        for pulseGenerator_ in self.pulseGenerator:
-            showIndent(outfile, level)
-            outfile.write('model_.PulseGenerator(\n')
-            pulseGenerator_.exportLiteral(outfile, level, name_='PulseGenerator')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('sineGenerator=[\n')
-        level += 1
-        for sineGenerator_ in self.sineGenerator:
-            showIndent(outfile, level)
-            outfile.write('model_.SineGenerator(\n')
-            sineGenerator_.exportLiteral(outfile, level, name_='SineGenerator')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('rampGenerator=[\n')
-        level += 1
-        for rampGenerator_ in self.rampGenerator:
-            showIndent(outfile, level)
-            outfile.write('model_.RampGenerator(\n')
-            rampGenerator_.exportLiteral(outfile, level, name_='RampGenerator')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('voltageClamp=[\n')
-        level += 1
-        for voltageClamp_ in self.voltageClamp:
-            showIndent(outfile, level)
-            outfile.write('model_.VoltageClamp(\n')
-            voltageClamp_.exportLiteral(outfile, level, name_='VoltageClamp')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('spikeArray=[\n')
-        level += 1
-        for spikeArray_ in self.spikeArray:
-            showIndent(outfile, level)
-            outfile.write('model_.SpikeArray(\n')
-            spikeArray_.exportLiteral(outfile, level, name_='SpikeArray')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('spikeGenerator=[\n')
-        level += 1
-        for spikeGenerator_ in self.spikeGenerator:
-            showIndent(outfile, level)
-            outfile.write('model_.SpikeGenerator(\n')
-            spikeGenerator_.exportLiteral(outfile, level, name_='SpikeGenerator')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('spikeGeneratorRandom=[\n')
-        level += 1
-        for spikeGeneratorRandom_ in self.spikeGeneratorRandom:
-            showIndent(outfile, level)
-            outfile.write('model_.SpikeGeneratorRandom(\n')
-            spikeGeneratorRandom_.exportLiteral(outfile, level, name_='SpikeGeneratorRandom')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('spikeGeneratorPoisson=[\n')
-        level += 1
-        for spikeGeneratorPoisson_ in self.spikeGeneratorPoisson:
-            showIndent(outfile, level)
-            outfile.write('model_.SpikeGeneratorPoisson(\n')
-            spikeGeneratorPoisson_.exportLiteral(outfile, level, name_='SpikeGeneratorPoisson')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('IF_curr_alpha=[\n')
-        level += 1
-        for IF_curr_alpha_ in self.IF_curr_alpha:
-            showIndent(outfile, level)
-            outfile.write('model_.IF_curr_alpha(\n')
-            IF_curr_alpha_.exportLiteral(outfile, level)
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('IF_curr_exp=[\n')
-        level += 1
-        for IF_curr_exp_ in self.IF_curr_exp:
-            showIndent(outfile, level)
-            outfile.write('model_.IF_curr_exp(\n')
-            IF_curr_exp_.exportLiteral(outfile, level)
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('IF_cond_alpha=[\n')
-        level += 1
-        for IF_cond_alpha_ in self.IF_cond_alpha:
-            showIndent(outfile, level)
-            outfile.write('model_.IF_cond_alpha(\n')
-            IF_cond_alpha_.exportLiteral(outfile, level)
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('IF_cond_exp=[\n')
-        level += 1
-        for IF_cond_exp_ in self.IF_cond_exp:
-            showIndent(outfile, level)
-            outfile.write('model_.IF_cond_exp(\n')
-            IF_cond_exp_.exportLiteral(outfile, level)
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('EIF_cond_exp_isfa_ista=[\n')
-        level += 1
-        for EIF_cond_exp_isfa_ista_ in self.EIF_cond_exp_isfa_ista:
-            showIndent(outfile, level)
-            outfile.write('model_.EIF_cond_exp_isfa_ista(\n')
-            EIF_cond_exp_isfa_ista_.exportLiteral(outfile, level)
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('EIF_cond_alpha_isfa_ista=[\n')
-        level += 1
-        for EIF_cond_alpha_isfa_ista_ in self.EIF_cond_alpha_isfa_ista:
-            showIndent(outfile, level)
-            outfile.write('model_.EIF_cond_alpha_isfa_ista(\n')
-            EIF_cond_alpha_isfa_ista_.exportLiteral(outfile, level)
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('HH_cond_exp=[\n')
-        level += 1
-        for HH_cond_exp_ in self.HH_cond_exp:
-            showIndent(outfile, level)
-            outfile.write('model_.HH_cond_exp(\n')
-            HH_cond_exp_.exportLiteral(outfile, level)
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('expCondSynapse=[\n')
-        level += 1
-        for expCondSynapse_ in self.expCondSynapse:
-            showIndent(outfile, level)
-            outfile.write('model_.ExpCondSynapse(\n')
-            expCondSynapse_.exportLiteral(outfile, level, name_='ExpCondSynapse')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('alphaCondSynapse=[\n')
-        level += 1
-        for alphaCondSynapse_ in self.alphaCondSynapse:
-            showIndent(outfile, level)
-            outfile.write('model_.AlphaCondSynapse(\n')
-            alphaCondSynapse_.exportLiteral(outfile, level, name_='AlphaCondSynapse')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('expCurrSynapse=[\n')
-        level += 1
-        for expCurrSynapse_ in self.expCurrSynapse:
-            showIndent(outfile, level)
-            outfile.write('model_.ExpCurrSynapse(\n')
-            expCurrSynapse_.exportLiteral(outfile, level, name_='ExpCurrSynapse')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('alphaCurrSynapse=[\n')
-        level += 1
-        for alphaCurrSynapse_ in self.alphaCurrSynapse:
-            showIndent(outfile, level)
-            outfile.write('model_.AlphaCurrSynapse(\n')
-            alphaCurrSynapse_.exportLiteral(outfile, level, name_='AlphaCurrSynapse')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('SpikeSourcePoisson=[\n')
-        level += 1
-        for SpikeSourcePoisson_ in self.SpikeSourcePoisson:
-            showIndent(outfile, level)
-            outfile.write('model_.SpikeSourcePoisson(\n')
-            SpikeSourcePoisson_.exportLiteral(outfile, level)
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('network=[\n')
-        level += 1
-        for network_ in self.network:
-            showIndent(outfile, level)
-            outfile.write('model_.Network(\n')
-            network_.exportLiteral(outfile, level, name_='Network')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
-        showIndent(outfile, level)
-        outfile.write('ComponentType=[\n')
-        level += 1
-        for ComponentType_ in self.ComponentType:
-            showIndent(outfile, level)
-            outfile.write('model_.ComponentType(\n')
-            ComponentType_.exportLiteral(outfile, level)
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        level -= 1
-        showIndent(outfile, level)
-        outfile.write('],\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         super(NeuroMLDocument, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
@@ -10864,167 +9463,207 @@ class NeuroMLDocument(Standalone):
             obj_ = IncludeType.factory()
             obj_.build(child_)
             self.include.append(obj_)
+            obj_.original_tagname_ = 'include'
         elif nodeName_ == 'extracellularProperties':
             obj_ = ExtracellularProperties.factory()
             obj_.build(child_)
             self.extracellularProperties.append(obj_)
+            obj_.original_tagname_ = 'extracellularProperties'
         elif nodeName_ == 'intracellularProperties':
             obj_ = IntracellularProperties.factory()
             obj_.build(child_)
             self.intracellularProperties.append(obj_)
+            obj_.original_tagname_ = 'intracellularProperties'
         elif nodeName_ == 'morphology':
             obj_ = Morphology.factory()
             obj_.build(child_)
             self.morphology.append(obj_)
+            obj_.original_tagname_ = 'morphology'
         elif nodeName_ == 'ionChannel':
             obj_ = IonChannel.factory()
             obj_.build(child_)
             self.ionChannel.append(obj_)
+            obj_.original_tagname_ = 'ionChannel'
         elif nodeName_ == 'decayingPoolConcentrationModel':
             class_obj_ = self.get_class_obj_(child_, DecayingPoolConcentrationModel)
             obj_ = class_obj_.factory()
             obj_.build(child_)
             self.decayingPoolConcentrationModel.append(obj_)
+            obj_.original_tagname_ = 'decayingPoolConcentrationModel'
         elif nodeName_ == 'expOneSynapse':
             obj_ = ExpOneSynapse.factory()
             obj_.build(child_)
             self.expOneSynapse.append(obj_)
+            obj_.original_tagname_ = 'expOneSynapse'
         elif nodeName_ == 'expTwoSynapse':
             class_obj_ = self.get_class_obj_(child_, ExpTwoSynapse)
             obj_ = class_obj_.factory()
             obj_.build(child_)
             self.expTwoSynapse.append(obj_)
+            obj_.original_tagname_ = 'expTwoSynapse'
         elif nodeName_ == 'blockingPlasticSynapse':
             obj_ = BlockingPlasticSynapse.factory()
             obj_.build(child_)
             self.blockingPlasticSynapse.append(obj_)
+            obj_.original_tagname_ = 'blockingPlasticSynapse'
         elif nodeName_ == 'biophysicalProperties':
             obj_ = BiophysicalProperties.factory()
             obj_.build(child_)
             self.biophysicalProperties.append(obj_)
+            obj_.original_tagname_ = 'biophysicalProperties'
         elif nodeName_ == 'cell':
             obj_ = Cell.factory()
             obj_.build(child_)
             self.cell.append(obj_)
+            obj_.original_tagname_ = 'cell'
         elif nodeName_ == 'baseCell':
             class_obj_ = self.get_class_obj_(child_, BaseCell)
             obj_ = class_obj_.factory()
             obj_.build(child_)
             self.baseCell.append(obj_)
+            obj_.original_tagname_ = 'baseCell'
         elif nodeName_ == 'iafTauCell':
             class_obj_ = self.get_class_obj_(child_, IaFTauCell)
             obj_ = class_obj_.factory()
             obj_.build(child_)
             self.iafTauCell.append(obj_)
+            obj_.original_tagname_ = 'iafTauCell'
         elif nodeName_ == 'iafTauRefCell':
             obj_ = IaFTauRefCell.factory()
             obj_.build(child_)
             self.iafTauRefCell.append(obj_)
+            obj_.original_tagname_ = 'iafTauRefCell'
         elif nodeName_ == 'iafCell':
             class_obj_ = self.get_class_obj_(child_, IaFCell)
             obj_ = class_obj_.factory()
             obj_.build(child_)
             self.iafCell.append(obj_)
+            obj_.original_tagname_ = 'iafCell'
         elif nodeName_ == 'iafRefCell':
             obj_ = IaFRefCell.factory()
             obj_.build(child_)
             self.iafRefCell.append(obj_)
+            obj_.original_tagname_ = 'iafRefCell'
         elif nodeName_ == 'izhikevichCell':
             obj_ = IzhikevichCell.factory()
             obj_.build(child_)
             self.izhikevichCell.append(obj_)
+            obj_.original_tagname_ = 'izhikevichCell'
         elif nodeName_ == 'adExIaFCell':
             obj_ = AdExIaFCell.factory()
             obj_.build(child_)
             self.adExIaFCell.append(obj_)
+            obj_.original_tagname_ = 'adExIaFCell'
         elif nodeName_ == 'pulseGenerator':
             obj_ = PulseGenerator.factory()
             obj_.build(child_)
             self.pulseGenerator.append(obj_)
+            obj_.original_tagname_ = 'pulseGenerator'
         elif nodeName_ == 'sineGenerator':
             obj_ = SineGenerator.factory()
             obj_.build(child_)
             self.sineGenerator.append(obj_)
+            obj_.original_tagname_ = 'sineGenerator'
         elif nodeName_ == 'rampGenerator':
             obj_ = RampGenerator.factory()
             obj_.build(child_)
             self.rampGenerator.append(obj_)
+            obj_.original_tagname_ = 'rampGenerator'
         elif nodeName_ == 'voltageClamp':
             obj_ = VoltageClamp.factory()
             obj_.build(child_)
             self.voltageClamp.append(obj_)
+            obj_.original_tagname_ = 'voltageClamp'
         elif nodeName_ == 'spikeArray':
             obj_ = SpikeArray.factory()
             obj_.build(child_)
             self.spikeArray.append(obj_)
+            obj_.original_tagname_ = 'spikeArray'
         elif nodeName_ == 'spikeGenerator':
             obj_ = SpikeGenerator.factory()
             obj_.build(child_)
             self.spikeGenerator.append(obj_)
+            obj_.original_tagname_ = 'spikeGenerator'
         elif nodeName_ == 'spikeGeneratorRandom':
             obj_ = SpikeGeneratorRandom.factory()
             obj_.build(child_)
             self.spikeGeneratorRandom.append(obj_)
+            obj_.original_tagname_ = 'spikeGeneratorRandom'
         elif nodeName_ == 'spikeGeneratorPoisson':
             obj_ = SpikeGeneratorPoisson.factory()
             obj_.build(child_)
             self.spikeGeneratorPoisson.append(obj_)
+            obj_.original_tagname_ = 'spikeGeneratorPoisson'
         elif nodeName_ == 'IF_curr_alpha':
             obj_ = IF_curr_alpha.factory()
             obj_.build(child_)
             self.IF_curr_alpha.append(obj_)
+            obj_.original_tagname_ = 'IF_curr_alpha'
         elif nodeName_ == 'IF_curr_exp':
             obj_ = IF_curr_exp.factory()
             obj_.build(child_)
             self.IF_curr_exp.append(obj_)
+            obj_.original_tagname_ = 'IF_curr_exp'
         elif nodeName_ == 'IF_cond_alpha':
             obj_ = IF_cond_alpha.factory()
             obj_.build(child_)
             self.IF_cond_alpha.append(obj_)
+            obj_.original_tagname_ = 'IF_cond_alpha'
         elif nodeName_ == 'IF_cond_exp':
             obj_ = IF_cond_exp.factory()
             obj_.build(child_)
             self.IF_cond_exp.append(obj_)
+            obj_.original_tagname_ = 'IF_cond_exp'
         elif nodeName_ == 'EIF_cond_exp_isfa_ista':
             obj_ = EIF_cond_exp_isfa_ista.factory()
             obj_.build(child_)
             self.EIF_cond_exp_isfa_ista.append(obj_)
+            obj_.original_tagname_ = 'EIF_cond_exp_isfa_ista'
         elif nodeName_ == 'EIF_cond_alpha_isfa_ista':
             obj_ = EIF_cond_alpha_isfa_ista.factory()
             obj_.build(child_)
             self.EIF_cond_alpha_isfa_ista.append(obj_)
+            obj_.original_tagname_ = 'EIF_cond_alpha_isfa_ista'
         elif nodeName_ == 'HH_cond_exp':
             obj_ = HH_cond_exp.factory()
             obj_.build(child_)
             self.HH_cond_exp.append(obj_)
+            obj_.original_tagname_ = 'HH_cond_exp'
         elif nodeName_ == 'expCondSynapse':
             obj_ = ExpCondSynapse.factory()
             obj_.build(child_)
             self.expCondSynapse.append(obj_)
+            obj_.original_tagname_ = 'expCondSynapse'
         elif nodeName_ == 'alphaCondSynapse':
             obj_ = AlphaCondSynapse.factory()
             obj_.build(child_)
             self.alphaCondSynapse.append(obj_)
+            obj_.original_tagname_ = 'alphaCondSynapse'
         elif nodeName_ == 'expCurrSynapse':
             obj_ = ExpCurrSynapse.factory()
             obj_.build(child_)
             self.expCurrSynapse.append(obj_)
+            obj_.original_tagname_ = 'expCurrSynapse'
         elif nodeName_ == 'alphaCurrSynapse':
             obj_ = AlphaCurrSynapse.factory()
             obj_.build(child_)
             self.alphaCurrSynapse.append(obj_)
+            obj_.original_tagname_ = 'alphaCurrSynapse'
         elif nodeName_ == 'SpikeSourcePoisson':
             obj_ = SpikeSourcePoisson.factory()
             obj_.build(child_)
             self.SpikeSourcePoisson.append(obj_)
+            obj_.original_tagname_ = 'SpikeSourcePoisson'
         elif nodeName_ == 'network':
             obj_ = Network.factory()
             obj_.build(child_)
             self.network.append(obj_)
+            obj_.original_tagname_ = 'network'
         elif nodeName_ == 'ComponentType':
             obj_ = ComponentType.factory()
             obj_.build(child_)
             self.ComponentType.append(obj_)
+            obj_.original_tagname_ = 'ComponentType'
         super(NeuroMLDocument, self).buildChildren(child_, node, nodeName_, True)
 # end class NeuroMLDocument
 
@@ -11033,10 +9672,16 @@ class BasePynnSynapse(BaseSynapse):
     subclass = None
     superclass = BaseSynapse
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn=None, extensiontype_=None):
+        self.original_tagname_ = None
         super(BasePynnSynapse, self).__init__(id, neuroLexId, name, metaid, notes, annotation, extensiontype_, )
         self.tau_syn = _cast(float, tau_syn)
         self.extensiontype_ = extensiontype_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, BasePynnSynapse)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if BasePynnSynapse.subclass:
             return BasePynnSynapse.subclass(*args_, **kwargs_)
         else:
@@ -11058,13 +9703,15 @@ class BasePynnSynapse(BaseSynapse):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='BasePynnSynapse')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='BasePynnSynapse', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -11080,26 +9727,13 @@ class BasePynnSynapse(BaseSynapse):
             outfile.write(' xsi:type="%s"' % self.extensiontype_)
     def exportChildren(self, outfile, level, namespace_='', name_='BasePynnSynapse', fromsubclass_=False, pretty_print=True):
         super(BasePynnSynapse, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='BasePynnSynapse'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.tau_syn is not None and 'tau_syn' not in already_processed:
-            already_processed.add('tau_syn')
-            showIndent(outfile, level)
-            outfile.write('tau_syn=%e,\n' % (self.tau_syn,))
-        super(BasePynnSynapse, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(BasePynnSynapse, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('tau_syn', node)
         if value is not None and 'tau_syn' not in already_processed:
@@ -11122,33 +9756,39 @@ class BasePynnSynapse(BaseSynapse):
 class basePyNNCell(BaseCell):
     subclass = None
     superclass = BaseCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, extensiontype_=None):
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, extensiontype_=None):
+        self.original_tagname_ = None
         super(basePyNNCell, self).__init__(id, neuroLexId, name, metaid, notes, annotation, extensiontype_, )
-        self.tau_syn_I = _cast(float, tau_syn_I)
-        self.tau_syn_E = _cast(float, tau_syn_E)
-        self.i_offset = _cast(float, i_offset)
         self.cm = _cast(float, cm)
+        self.i_offset = _cast(float, i_offset)
+        self.tau_syn_E = _cast(float, tau_syn_E)
+        self.tau_syn_I = _cast(float, tau_syn_I)
         self.v_init = _cast(float, v_init)
         self.extensiontype_ = extensiontype_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, basePyNNCell)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if basePyNNCell.subclass:
             return basePyNNCell.subclass(*args_, **kwargs_)
         else:
             return basePyNNCell(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_tau_syn_I(self): return self.tau_syn_I
-    def set_tau_syn_I(self, tau_syn_I): self.tau_syn_I = tau_syn_I
-    def get_tau_syn_E(self): return self.tau_syn_E
-    def set_tau_syn_E(self, tau_syn_E): self.tau_syn_E = tau_syn_E
-    def get_i_offset(self): return self.i_offset
-    def set_i_offset(self, i_offset): self.i_offset = i_offset
     def get_cm(self): return self.cm
     def set_cm(self, cm): self.cm = cm
-    def get_v_init(self): return self.v_init
-    def set_v_init(self, v_init): self.v_init = v_init
-    def get_extensiontype_(self): return self.extensiontype_
-    def set_extensiontype_(self, extensiontype_): self.extensiontype_ = extensiontype_
-    def hasContent_(self):
+    def get_i_offset(self): return self.i_offset
+    def set_i_offset(self, i_offset): self.i_offset = i_offset
+    def get_tau_syn_E(self): return self.tau_syn_E
+    def set_tau_syn_E(self, tau_syn_E): self.tau_syn_E = tau_syn_E
+    def get_tau_syn_I(self): return self.tau_syn_I
+    def set_tau_syn_I(self, tau_syn_I): self.tau_syn_I = tau_syn_I
+    def get_v_init(self): return self.v_init
+    def set_v_init(self, v_init): self.v_init = v_init
+    def get_extensiontype_(self): return self.extensiontype_
+    def set_extensiontype_(self, extensiontype_): self.extensiontype_ = extensiontype_
+    def hasContent_(self):
         if (
             super(basePyNNCell, self).hasContent_()
         ):
@@ -11160,31 +9800,33 @@ class basePyNNCell(BaseCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='basePyNNCell')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='basePyNNCell', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='basePyNNCell'):
         super(basePyNNCell, self).exportAttributes(outfile, level, already_processed, namespace_, name_='basePyNNCell')
-        if self.tau_syn_I is not None and 'tau_syn_I' not in already_processed:
-            already_processed.add('tau_syn_I')
-            outfile.write(' tau_syn_I="%s"' % self.gds_format_double(self.tau_syn_I, input_name='tau_syn_I'))
-        if self.tau_syn_E is not None and 'tau_syn_E' not in already_processed:
-            already_processed.add('tau_syn_E')
-            outfile.write(' tau_syn_E="%s"' % self.gds_format_double(self.tau_syn_E, input_name='tau_syn_E'))
-        if self.i_offset is not None and 'i_offset' not in already_processed:
-            already_processed.add('i_offset')
-            outfile.write(' i_offset="%s"' % self.gds_format_double(self.i_offset, input_name='i_offset'))
         if self.cm is not None and 'cm' not in already_processed:
             already_processed.add('cm')
             outfile.write(' cm="%s"' % self.gds_format_double(self.cm, input_name='cm'))
+        if self.i_offset is not None and 'i_offset' not in already_processed:
+            already_processed.add('i_offset')
+            outfile.write(' i_offset="%s"' % self.gds_format_double(self.i_offset, input_name='i_offset'))
+        if self.tau_syn_E is not None and 'tau_syn_E' not in already_processed:
+            already_processed.add('tau_syn_E')
+            outfile.write(' tau_syn_E="%s"' % self.gds_format_double(self.tau_syn_E, input_name='tau_syn_E'))
+        if self.tau_syn_I is not None and 'tau_syn_I' not in already_processed:
+            already_processed.add('tau_syn_I')
+            outfile.write(' tau_syn_I="%s"' % self.gds_format_double(self.tau_syn_I, input_name='tau_syn_I'))
         if self.v_init is not None and 'v_init' not in already_processed:
             already_processed.add('v_init')
             outfile.write(' v_init="%s"' % self.gds_format_double(self.v_init, input_name='v_init'))
@@ -11194,57 +9836,21 @@ class basePyNNCell(BaseCell):
             outfile.write(' xsi:type="%s"' % self.extensiontype_)
     def exportChildren(self, outfile, level, namespace_='', name_='basePyNNCell', fromsubclass_=False, pretty_print=True):
         super(basePyNNCell, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='basePyNNCell'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.tau_syn_I is not None and 'tau_syn_I' not in already_processed:
-            already_processed.add('tau_syn_I')
-            showIndent(outfile, level)
-            outfile.write('tau_syn_I=%e,\n' % (self.tau_syn_I,))
-        if self.tau_syn_E is not None and 'tau_syn_E' not in already_processed:
-            already_processed.add('tau_syn_E')
-            showIndent(outfile, level)
-            outfile.write('tau_syn_E=%e,\n' % (self.tau_syn_E,))
-        if self.i_offset is not None and 'i_offset' not in already_processed:
-            already_processed.add('i_offset')
-            showIndent(outfile, level)
-            outfile.write('i_offset=%e,\n' % (self.i_offset,))
-        if self.cm is not None and 'cm' not in already_processed:
-            already_processed.add('cm')
-            showIndent(outfile, level)
-            outfile.write('cm=%e,\n' % (self.cm,))
-        if self.v_init is not None and 'v_init' not in already_processed:
-            already_processed.add('v_init')
-            showIndent(outfile, level)
-            outfile.write('v_init=%e,\n' % (self.v_init,))
-        super(basePyNNCell, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(basePyNNCell, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('tau_syn_I', node)
-        if value is not None and 'tau_syn_I' not in already_processed:
-            already_processed.add('tau_syn_I')
-            try:
-                self.tau_syn_I = float(value)
-            except ValueError as exp:
-                raise ValueError('Bad float/double attribute (tau_syn_I): %s' % exp)
-        value = find_attr_value_('tau_syn_E', node)
-        if value is not None and 'tau_syn_E' not in already_processed:
-            already_processed.add('tau_syn_E')
+        value = find_attr_value_('cm', node)
+        if value is not None and 'cm' not in already_processed:
+            already_processed.add('cm')
             try:
-                self.tau_syn_E = float(value)
+                self.cm = float(value)
             except ValueError as exp:
-                raise ValueError('Bad float/double attribute (tau_syn_E): %s' % exp)
+                raise ValueError('Bad float/double attribute (cm): %s' % exp)
         value = find_attr_value_('i_offset', node)
         if value is not None and 'i_offset' not in already_processed:
             already_processed.add('i_offset')
@@ -11252,13 +9858,20 @@ class basePyNNCell(BaseCell):
                 self.i_offset = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (i_offset): %s' % exp)
-        value = find_attr_value_('cm', node)
-        if value is not None and 'cm' not in already_processed:
-            already_processed.add('cm')
+        value = find_attr_value_('tau_syn_E', node)
+        if value is not None and 'tau_syn_E' not in already_processed:
+            already_processed.add('tau_syn_E')
             try:
-                self.cm = float(value)
+                self.tau_syn_E = float(value)
             except ValueError as exp:
-                raise ValueError('Bad float/double attribute (cm): %s' % exp)
+                raise ValueError('Bad float/double attribute (tau_syn_E): %s' % exp)
+        value = find_attr_value_('tau_syn_I', node)
+        if value is not None and 'tau_syn_I' not in already_processed:
+            already_processed.add('tau_syn_I')
+            try:
+                self.tau_syn_I = float(value)
+            except ValueError as exp:
+                raise ValueError('Bad float/double attribute (tau_syn_I): %s' % exp)
         value = find_attr_value_('v_init', node)
         if value is not None and 'v_init' not in already_processed:
             already_processed.add('v_init')
@@ -11280,11 +9893,16 @@ class basePyNNCell(BaseCell):
 class ConcentrationModel_D(DecayingPoolConcentrationModel):
     subclass = None
     superclass = DecayingPoolConcentrationModel
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, ion=None, shellThickness=None, restingConc=None, decayConstant=None, type_=None):
-        super(ConcentrationModel_D, self).__init__(id, neuroLexId, name, metaid, notes, annotation, ion, shellThickness, restingConc, decayConstant, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, ion=None, restingConc=None, decayConstant=None, shellThickness=None, type_=None):
+        self.original_tagname_ = None
+        super(ConcentrationModel_D, self).__init__(id, neuroLexId, name, metaid, notes, annotation, ion, restingConc, decayConstant, shellThickness, )
         self.type_ = _cast(None, type_)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, ConcentrationModel_D)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if ConcentrationModel_D.subclass:
             return ConcentrationModel_D.subclass(*args_, **kwargs_)
         else:
@@ -11304,13 +9922,15 @@ class ConcentrationModel_D(DecayingPoolConcentrationModel):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='ConcentrationModel_D')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='ConcentrationModel_D', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -11319,29 +9939,16 @@ class ConcentrationModel_D(DecayingPoolConcentrationModel):
         super(ConcentrationModel_D, self).exportAttributes(outfile, level, already_processed, namespace_, name_='ConcentrationModel_D')
         if self.type_ is not None and 'type_' not in already_processed:
             already_processed.add('type_')
-            outfile.write(' type=%s' % (self.gds_format_string(quote_attrib(self.type_).encode(ExternalEncoding), input_name='type'), ))
+            outfile.write(' type=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.type_), input_name='type')), ))
     def exportChildren(self, outfile, level, namespace_='', name_='ConcentrationModel_D', fromsubclass_=False, pretty_print=True):
         super(ConcentrationModel_D, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='ConcentrationModel_D'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.type_ is not None and 'type_' not in already_processed:
-            already_processed.add('type_')
-            showIndent(outfile, level)
-            outfile.write('type_="%s",\n' % (self.type_,))
-        super(ConcentrationModel_D, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(ConcentrationModel_D, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('type', node)
         if value is not None and 'type' not in already_processed:
@@ -11361,13 +9968,19 @@ class Cell(BaseCell):
     to the id of the biophysicalProperties"""
     subclass = None
     superclass = BaseCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, biophysicalProperties_attr=None, morphology_attr=None, morphology=None, biophysicalProperties=None):
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, morphology_attr=None, biophysicalProperties_attr=None, morphology=None, biophysicalProperties=None):
+        self.original_tagname_ = None
         super(Cell, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
-        self.biophysicalProperties_attr = _cast(None, biophysicalProperties_attr)
         self.morphology_attr = _cast(None, morphology_attr)
+        self.biophysicalProperties_attr = _cast(None, biophysicalProperties_attr)
         self.morphology = morphology
         self.biophysicalProperties = biophysicalProperties
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, Cell)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if Cell.subclass:
             return Cell.subclass(*args_, **kwargs_)
         else:
@@ -11377,10 +9990,10 @@ class Cell(BaseCell):
     def set_morphology(self, morphology): self.morphology = morphology
     def get_biophysicalProperties(self): return self.biophysicalProperties
     def set_biophysicalProperties(self, biophysicalProperties): self.biophysicalProperties = biophysicalProperties
-    def get_biophysicalProperties_attr(self): return self.biophysicalProperties_attr
-    def set_biophysicalProperties_attr(self, biophysicalProperties_attr): self.biophysicalProperties_attr = biophysicalProperties_attr
     def get_morphology_attr(self): return self.morphology_attr
     def set_morphology_attr(self, morphology_attr): self.morphology_attr = morphology_attr
+    def get_biophysicalProperties_attr(self): return self.biophysicalProperties_attr
+    def set_biophysicalProperties_attr(self, biophysicalProperties_attr): self.biophysicalProperties_attr = biophysicalProperties_attr
     def hasContent_(self):
         if (
             self.morphology is not None or
@@ -11395,25 +10008,27 @@ class Cell(BaseCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='Cell')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='Cell', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='Cell'):
         super(Cell, self).exportAttributes(outfile, level, already_processed, namespace_, name_='Cell')
-        if self.biophysicalProperties_attr is not None and 'biophysicalProperties_attr' not in already_processed:
-            already_processed.add('biophysicalProperties_attr')
-            outfile.write(' biophysicalProperties=%s' % (self.gds_format_string(quote_attrib(self.biophysicalProperties_attr).encode(ExternalEncoding), input_name='biophysicalProperties_attr'), ))
         if self.morphology_attr is not None and 'morphology_attr' not in already_processed:
             already_processed.add('morphology_attr')
-            outfile.write(' morphology=%s' % (self.gds_format_string(quote_attrib(self.morphology_attr).encode(ExternalEncoding), input_name='morphology_attr'), ))
+            outfile.write(' morphology=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.morphology_attr), input_name='morphology_attr')), ))
+        if self.biophysicalProperties_attr is not None and 'biophysicalProperties_attr' not in already_processed:
+            already_processed.add('biophysicalProperties_attr')
+            outfile.write(' biophysicalProperties=%s' % (self.gds_encode(self.gds_format_string(quote_attrib(self.biophysicalProperties_attr), input_name='biophysicalProperties_attr')), ))
     def exportChildren(self, outfile, level, namespace_='', name_='Cell', fromsubclass_=False, pretty_print=True):
         super(Cell, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
         if pretty_print:
@@ -11424,61 +10039,34 @@ class Cell(BaseCell):
             self.morphology.export(outfile, level, namespace_, name_='morphology', pretty_print=pretty_print)
         if self.biophysicalProperties is not None:
             self.biophysicalProperties.export(outfile, level, namespace_, name_='biophysicalProperties', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='Cell'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.biophysicalProperties_attr is not None and 'biophysicalProperties_attr' not in already_processed:
-            already_processed.add('biophysicalProperties_attr')
-            showIndent(outfile, level)
-            outfile.write('biophysicalProperties_attr="%s",\n' % (self.biophysicalProperties_attr,))
-        if self.morphology_attr is not None and 'morphology_attr' not in already_processed:
-            already_processed.add('morphology_attr')
-            showIndent(outfile, level)
-            outfile.write('morphology_attr="%s",\n' % (self.morphology_attr,))
-        super(Cell, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(Cell, self).exportLiteralChildren(outfile, level, name_)
-        if self.morphology is not None:
-            showIndent(outfile, level)
-            outfile.write('morphology=model_.Morphology(\n')
-            self.morphology.exportLiteral(outfile, level, name_='morphology')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.biophysicalProperties is not None:
-            showIndent(outfile, level)
-            outfile.write('biophysicalProperties=model_.BiophysicalProperties(\n')
-            self.biophysicalProperties.exportLiteral(outfile, level, name_='biophysicalProperties')
-            showIndent(outfile, level)
-            outfile.write('),\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('biophysicalProperties', node)
-        if value is not None and 'biophysicalProperties_attr' not in already_processed:
-            already_processed.add('biophysicalProperties_attr')
-            self.biophysicalProperties_attr = value
         value = find_attr_value_('morphology', node)
         if value is not None and 'morphology_attr' not in already_processed:
             already_processed.add('morphology_attr')
             self.morphology_attr = value
+        value = find_attr_value_('biophysicalProperties', node)
+        if value is not None and 'biophysicalProperties_attr' not in already_processed:
+            already_processed.add('biophysicalProperties_attr')
+            self.biophysicalProperties_attr = value
         super(Cell, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         if nodeName_ == 'morphology':
             obj_ = Morphology.factory()
             obj_.build(child_)
-            self.set_morphology(obj_)
+            self.morphology = obj_
+            obj_.original_tagname_ = 'morphology'
         elif nodeName_ == 'biophysicalProperties':
             obj_ = BiophysicalProperties.factory()
             obj_.build(child_)
-            self.set_biophysicalProperties(obj_)
+            self.biophysicalProperties = obj_
+            obj_.original_tagname_ = 'biophysicalProperties'
         super(Cell, self).buildChildren(child_, node, nodeName_, True)
 # end class Cell
 
@@ -11486,63 +10074,88 @@ class Cell(BaseCell):
 class AdExIaFCell(BaseCell):
     subclass = None
     superclass = BaseCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, reset=None, EL=None, C=None, b=None, refract=None, VT=None, delT=None, a=None, thresh=None, gL=None, tauw=None):
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, C=None, gL=None, EL=None, reset=None, VT=None, thresh=None, delT=None, tauw=None, refract=None, a=None, b=None):
+        self.original_tagname_ = None
         super(AdExIaFCell, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
-        self.reset = _cast(None, reset)
-        self.EL = _cast(None, EL)
         self.C = _cast(None, C)
-        self.b = _cast(None, b)
-        self.refract = _cast(None, refract)
+        self.gL = _cast(None, gL)
+        self.EL = _cast(None, EL)
+        self.reset = _cast(None, reset)
         self.VT = _cast(None, VT)
-        self.delT = _cast(None, delT)
-        self.a = _cast(None, a)
         self.thresh = _cast(None, thresh)
-        self.gL = _cast(None, gL)
+        self.delT = _cast(None, delT)
         self.tauw = _cast(None, tauw)
-        pass
+        self.refract = _cast(None, refract)
+        self.a = _cast(None, a)
+        self.b = _cast(None, b)
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, AdExIaFCell)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if AdExIaFCell.subclass:
             return AdExIaFCell.subclass(*args_, **kwargs_)
         else:
             return AdExIaFCell(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_reset(self): return self.reset
-    def set_reset(self, reset): self.reset = reset
-    def validate_Nml2Quantity_voltage(self, value):
-        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
-        pass
-    def get_EL(self): return self.EL
-    def set_EL(self, EL): self.EL = EL
     def get_C(self): return self.C
     def set_C(self, C): self.C = C
-    def validate_Nml2Quantity_capacitance(self, value):
-        # Validate type Nml2Quantity_capacitance, a restriction on xs:string.
-        pass
-    def get_b(self): return self.b
-    def set_b(self, b): self.b = b
-    def validate_Nml2Quantity_current(self, value):
-        # Validate type Nml2Quantity_current, a restriction on xs:string.
-        pass
-    def get_refract(self): return self.refract
-    def set_refract(self, refract): self.refract = refract
-    def validate_Nml2Quantity_time(self, value):
-        # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
+    def get_gL(self): return self.gL
+    def set_gL(self, gL): self.gL = gL
+    def get_EL(self): return self.EL
+    def set_EL(self, EL): self.EL = EL
+    def get_reset(self): return self.reset
+    def set_reset(self, reset): self.reset = reset
     def get_VT(self): return self.VT
     def set_VT(self, VT): self.VT = VT
+    def get_thresh(self): return self.thresh
+    def set_thresh(self, thresh): self.thresh = thresh
     def get_delT(self): return self.delT
     def set_delT(self, delT): self.delT = delT
+    def get_tauw(self): return self.tauw
+    def set_tauw(self, tauw): self.tauw = tauw
+    def get_refract(self): return self.refract
+    def set_refract(self, refract): self.refract = refract
     def get_a(self): return self.a
     def set_a(self, a): self.a = a
+    def get_b(self): return self.b
+    def set_b(self, b): self.b = b
+    def validate_Nml2Quantity_capacitance(self, value):
+        # Validate type Nml2Quantity_capacitance, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_capacitance_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_capacitance_patterns_, ))
+    validate_Nml2Quantity_capacitance_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(F|uF|nF|pF)$']]
     def validate_Nml2Quantity_conductance(self, value):
         # Validate type Nml2Quantity_conductance, a restriction on xs:string.
-        pass
-    def get_thresh(self): return self.thresh
-    def set_thresh(self, thresh): self.thresh = thresh
-    def get_gL(self): return self.gL
-    def set_gL(self, gL): self.gL = gL
-    def get_tauw(self): return self.tauw
-    def set_tauw(self, tauw): self.tauw = tauw
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_conductance_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_conductance_patterns_, ))
+    validate_Nml2Quantity_conductance_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(S|mS|uS|nS|pS)$']]
+    def validate_Nml2Quantity_voltage(self, value):
+        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_voltage_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_voltage_patterns_, ))
+    validate_Nml2Quantity_voltage_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(V|mV)$']]
+    def validate_Nml2Quantity_time(self, value):
+        # Validate type Nml2Quantity_time, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
+    def validate_Nml2Quantity_current(self, value):
+        # Validate type Nml2Quantity_current, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_current_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_current_patterns_, ))
+    validate_Nml2Quantity_current_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(A|uA|nA|pA)$']]
     def hasContent_(self):
         if (
             super(AdExIaFCell, self).hasContent_()
@@ -11555,170 +10168,119 @@ class AdExIaFCell(BaseCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='AdExIaFCell')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='AdExIaFCell', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='AdExIaFCell'):
         super(AdExIaFCell, self).exportAttributes(outfile, level, already_processed, namespace_, name_='AdExIaFCell')
-        if self.reset is not None and 'reset' not in already_processed:
-            already_processed.add('reset')
-            outfile.write(' reset=%s' % (quote_attrib(self.reset), ))
-        if self.EL is not None and 'EL' not in already_processed:
-            already_processed.add('EL')
-            outfile.write(' EL=%s' % (quote_attrib(self.EL), ))
         if self.C is not None and 'C' not in already_processed:
             already_processed.add('C')
             outfile.write(' C=%s' % (quote_attrib(self.C), ))
-        if self.b is not None and 'b' not in already_processed:
-            already_processed.add('b')
-            outfile.write(' b=%s' % (quote_attrib(self.b), ))
-        if self.refract is not None and 'refract' not in already_processed:
-            already_processed.add('refract')
-            outfile.write(' refract=%s' % (quote_attrib(self.refract), ))
+        if self.gL is not None and 'gL' not in already_processed:
+            already_processed.add('gL')
+            outfile.write(' gL=%s' % (quote_attrib(self.gL), ))
+        if self.EL is not None and 'EL' not in already_processed:
+            already_processed.add('EL')
+            outfile.write(' EL=%s' % (quote_attrib(self.EL), ))
+        if self.reset is not None and 'reset' not in already_processed:
+            already_processed.add('reset')
+            outfile.write(' reset=%s' % (quote_attrib(self.reset), ))
         if self.VT is not None and 'VT' not in already_processed:
             already_processed.add('VT')
             outfile.write(' VT=%s' % (quote_attrib(self.VT), ))
-        if self.delT is not None and 'delT' not in already_processed:
-            already_processed.add('delT')
-            outfile.write(' delT=%s' % (quote_attrib(self.delT), ))
-        if self.a is not None and 'a' not in already_processed:
-            already_processed.add('a')
-            outfile.write(' a=%s' % (quote_attrib(self.a), ))
         if self.thresh is not None and 'thresh' not in already_processed:
             already_processed.add('thresh')
             outfile.write(' thresh=%s' % (quote_attrib(self.thresh), ))
-        if self.gL is not None and 'gL' not in already_processed:
-            already_processed.add('gL')
-            outfile.write(' gL=%s' % (quote_attrib(self.gL), ))
+        if self.delT is not None and 'delT' not in already_processed:
+            already_processed.add('delT')
+            outfile.write(' delT=%s' % (quote_attrib(self.delT), ))
         if self.tauw is not None and 'tauw' not in already_processed:
             already_processed.add('tauw')
             outfile.write(' tauw=%s' % (quote_attrib(self.tauw), ))
-    def exportChildren(self, outfile, level, namespace_='', name_='AdExIaFCell', fromsubclass_=False, pretty_print=True):
-        super(AdExIaFCell, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='AdExIaFCell'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.reset is not None and 'reset' not in already_processed:
-            already_processed.add('reset')
-            showIndent(outfile, level)
-            outfile.write('reset="%s",\n' % (self.reset,))
-        if self.EL is not None and 'EL' not in already_processed:
-            already_processed.add('EL')
-            showIndent(outfile, level)
-            outfile.write('EL="%s",\n' % (self.EL,))
-        if self.C is not None and 'C' not in already_processed:
-            already_processed.add('C')
-            showIndent(outfile, level)
-            outfile.write('C="%s",\n' % (self.C,))
-        if self.b is not None and 'b' not in already_processed:
-            already_processed.add('b')
-            showIndent(outfile, level)
-            outfile.write('b="%s",\n' % (self.b,))
         if self.refract is not None and 'refract' not in already_processed:
             already_processed.add('refract')
-            showIndent(outfile, level)
-            outfile.write('refract="%s",\n' % (self.refract,))
-        if self.VT is not None and 'VT' not in already_processed:
-            already_processed.add('VT')
-            showIndent(outfile, level)
-            outfile.write('VT="%s",\n' % (self.VT,))
-        if self.delT is not None and 'delT' not in already_processed:
-            already_processed.add('delT')
-            showIndent(outfile, level)
-            outfile.write('delT="%s",\n' % (self.delT,))
+            outfile.write(' refract=%s' % (quote_attrib(self.refract), ))
         if self.a is not None and 'a' not in already_processed:
             already_processed.add('a')
-            showIndent(outfile, level)
-            outfile.write('a="%s",\n' % (self.a,))
-        if self.thresh is not None and 'thresh' not in already_processed:
-            already_processed.add('thresh')
-            showIndent(outfile, level)
-            outfile.write('thresh="%s",\n' % (self.thresh,))
-        if self.gL is not None and 'gL' not in already_processed:
-            already_processed.add('gL')
-            showIndent(outfile, level)
-            outfile.write('gL="%s",\n' % (self.gL,))
-        if self.tauw is not None and 'tauw' not in already_processed:
-            already_processed.add('tauw')
-            showIndent(outfile, level)
-            outfile.write('tauw="%s",\n' % (self.tauw,))
-        super(AdExIaFCell, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(AdExIaFCell, self).exportLiteralChildren(outfile, level, name_)
+            outfile.write(' a=%s' % (quote_attrib(self.a), ))
+        if self.b is not None and 'b' not in already_processed:
+            already_processed.add('b')
+            outfile.write(' b=%s' % (quote_attrib(self.b), ))
+    def exportChildren(self, outfile, level, namespace_='', name_='AdExIaFCell', fromsubclass_=False, pretty_print=True):
+        super(AdExIaFCell, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('reset', node)
-        if value is not None and 'reset' not in already_processed:
-            already_processed.add('reset')
-            self.reset = value
-            self.validate_Nml2Quantity_voltage(self.reset)    # validate type Nml2Quantity_voltage
-        value = find_attr_value_('EL', node)
-        if value is not None and 'EL' not in already_processed:
-            already_processed.add('EL')
-            self.EL = value
-            self.validate_Nml2Quantity_voltage(self.EL)    # validate type Nml2Quantity_voltage
         value = find_attr_value_('C', node)
         if value is not None and 'C' not in already_processed:
             already_processed.add('C')
             self.C = value
             self.validate_Nml2Quantity_capacitance(self.C)    # validate type Nml2Quantity_capacitance
-        value = find_attr_value_('b', node)
-        if value is not None and 'b' not in already_processed:
-            already_processed.add('b')
-            self.b = value
-            self.validate_Nml2Quantity_current(self.b)    # validate type Nml2Quantity_current
-        value = find_attr_value_('refract', node)
-        if value is not None and 'refract' not in already_processed:
-            already_processed.add('refract')
-            self.refract = value
-            self.validate_Nml2Quantity_time(self.refract)    # validate type Nml2Quantity_time
+        value = find_attr_value_('gL', node)
+        if value is not None and 'gL' not in already_processed:
+            already_processed.add('gL')
+            self.gL = value
+            self.validate_Nml2Quantity_conductance(self.gL)    # validate type Nml2Quantity_conductance
+        value = find_attr_value_('EL', node)
+        if value is not None and 'EL' not in already_processed:
+            already_processed.add('EL')
+            self.EL = value
+            self.validate_Nml2Quantity_voltage(self.EL)    # validate type Nml2Quantity_voltage
+        value = find_attr_value_('reset', node)
+        if value is not None and 'reset' not in already_processed:
+            already_processed.add('reset')
+            self.reset = value
+            self.validate_Nml2Quantity_voltage(self.reset)    # validate type Nml2Quantity_voltage
         value = find_attr_value_('VT', node)
         if value is not None and 'VT' not in already_processed:
             already_processed.add('VT')
             self.VT = value
             self.validate_Nml2Quantity_voltage(self.VT)    # validate type Nml2Quantity_voltage
-        value = find_attr_value_('delT', node)
-        if value is not None and 'delT' not in already_processed:
-            already_processed.add('delT')
-            self.delT = value
-            self.validate_Nml2Quantity_voltage(self.delT)    # validate type Nml2Quantity_voltage
-        value = find_attr_value_('a', node)
-        if value is not None and 'a' not in already_processed:
-            already_processed.add('a')
-            self.a = value
-            self.validate_Nml2Quantity_conductance(self.a)    # validate type Nml2Quantity_conductance
         value = find_attr_value_('thresh', node)
         if value is not None and 'thresh' not in already_processed:
             already_processed.add('thresh')
             self.thresh = value
             self.validate_Nml2Quantity_voltage(self.thresh)    # validate type Nml2Quantity_voltage
-        value = find_attr_value_('gL', node)
-        if value is not None and 'gL' not in already_processed:
-            already_processed.add('gL')
-            self.gL = value
-            self.validate_Nml2Quantity_conductance(self.gL)    # validate type Nml2Quantity_conductance
+        value = find_attr_value_('delT', node)
+        if value is not None and 'delT' not in already_processed:
+            already_processed.add('delT')
+            self.delT = value
+            self.validate_Nml2Quantity_voltage(self.delT)    # validate type Nml2Quantity_voltage
         value = find_attr_value_('tauw', node)
         if value is not None and 'tauw' not in already_processed:
             already_processed.add('tauw')
             self.tauw = value
             self.validate_Nml2Quantity_time(self.tauw)    # validate type Nml2Quantity_time
+        value = find_attr_value_('refract', node)
+        if value is not None and 'refract' not in already_processed:
+            already_processed.add('refract')
+            self.refract = value
+            self.validate_Nml2Quantity_time(self.refract)    # validate type Nml2Quantity_time
+        value = find_attr_value_('a', node)
+        if value is not None and 'a' not in already_processed:
+            already_processed.add('a')
+            self.a = value
+            self.validate_Nml2Quantity_conductance(self.a)    # validate type Nml2Quantity_conductance
+        value = find_attr_value_('b', node)
+        if value is not None and 'b' not in already_processed:
+            already_processed.add('b')
+            self.b = value
+            self.validate_Nml2Quantity_current(self.b)    # validate type Nml2Quantity_current
         super(AdExIaFCell, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         super(AdExIaFCell, self).buildChildren(child_, node, nodeName_, True)
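The hunks above replace the old no-op `pass` validators with real pattern checks. A minimal stand-alone sketch of how these generated validators behave (the `validate_voltage` helper below is hypothetical, written only to illustrate the regex from `validate_Nml2Quantity_voltage_patterns_`; the generated module routes this through `gds_validate_simple_patterns` instead):

```python
import re
import warnings

# Pattern copied from the generated validator: optional sign, a decimal
# number with optional exponent, optional whitespace, then a voltage
# unit (V or mV).
NML2_VOLTAGE_PATTERN = r'^-?([0-9]*(\.[0-9]+)?)([eE]-?[0-9]+)?[\s]*(V|mV)$'

def validate_voltage(value):
    """Return True if `value` is a valid Nml2Quantity_voltage string.

    Mirrors the generated behavior: None passes silently, a mismatch
    emits a warning rather than raising.
    """
    if value is None:
        return True  # generated validators skip None attributes
    if re.match(NML2_VOLTAGE_PATTERN, value):
        return True
    warnings.warn('Value "%s" does not match xsd pattern restrictions' % value)
    return False

print(validate_voltage('-60 mV'))    # True
print(validate_voltage('5.5e-2 V'))  # True
print(validate_voltage('10 kV'))     # False: kV is not an allowed unit
```

Note the lenient contract: invalid values are still assigned to the attribute (e.g. `self.VT = value` happens before validation), so a bad quantity string survives a load/save round-trip with only a warning.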
@@ -11729,39 +10291,52 @@ class AdExIaFCell(BaseCell):
 class IzhikevichCell(BaseCell):
     subclass = None
     superclass = BaseCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, a=None, c=None, b=None, d=None, v0=None, thresh=None):
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, v0=None, thresh=None, a=None, b=None, c=None, d=None):
+        self.original_tagname_ = None
         super(IzhikevichCell, self).__init__(id, neuroLexId, name, metaid, notes, annotation, )
+        self.v0 = _cast(None, v0)
+        self.thresh = _cast(None, thresh)
         self.a = _cast(None, a)
-        self.c = _cast(None, c)
         self.b = _cast(None, b)
+        self.c = _cast(None, c)
         self.d = _cast(None, d)
-        self.v0 = _cast(None, v0)
-        self.thresh = _cast(None, thresh)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, IzhikevichCell)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if IzhikevichCell.subclass:
             return IzhikevichCell.subclass(*args_, **kwargs_)
         else:
             return IzhikevichCell(*args_, **kwargs_)
     factory = staticmethod(factory)
+    def get_v0(self): return self.v0
+    def set_v0(self, v0): self.v0 = v0
+    def get_thresh(self): return self.thresh
+    def set_thresh(self, thresh): self.thresh = thresh
     def get_a(self): return self.a
     def set_a(self, a): self.a = a
-    def validate_Nml2Quantity_none(self, value):
-        # Validate type Nml2Quantity_none, a restriction on xs:string.
-        pass
-    def get_c(self): return self.c
-    def set_c(self, c): self.c = c
     def get_b(self): return self.b
     def set_b(self, b): self.b = b
+    def get_c(self): return self.c
+    def set_c(self, c): self.c = c
     def get_d(self): return self.d
     def set_d(self, d): self.d = d
-    def get_v0(self): return self.v0
-    def set_v0(self, v0): self.v0 = v0
     def validate_Nml2Quantity_voltage(self, value):
         # Validate type Nml2Quantity_voltage, a restriction on xs:string.
-        pass
-    def get_thresh(self): return self.thresh
-    def set_thresh(self, thresh): self.thresh = thresh
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_voltage_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_voltage_patterns_, ))
+    validate_Nml2Quantity_voltage_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(V|mV)$']]
+    def validate_Nml2Quantity_none(self, value):
+        # Validate type Nml2Quantity_none, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_none_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_none_patterns_, ))
+    validate_Nml2Quantity_none_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?$']]
     def hasContent_(self):
         if (
             super(IzhikevichCell, self).hasContent_()
@@ -11774,110 +10349,79 @@ class IzhikevichCell(BaseCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='IzhikevichCell')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='IzhikevichCell', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='IzhikevichCell'):
         super(IzhikevichCell, self).exportAttributes(outfile, level, already_processed, namespace_, name_='IzhikevichCell')
-        if self.a is not None and 'a' not in already_processed:
-            already_processed.add('a')
-            outfile.write(' a=%s' % (quote_attrib(self.a), ))
-        if self.c is not None and 'c' not in already_processed:
-            already_processed.add('c')
-            outfile.write(' c=%s' % (quote_attrib(self.c), ))
-        if self.b is not None and 'b' not in already_processed:
-            already_processed.add('b')
-            outfile.write(' b=%s' % (quote_attrib(self.b), ))
-        if self.d is not None and 'd' not in already_processed:
-            already_processed.add('d')
-            outfile.write(' d=%s' % (quote_attrib(self.d), ))
         if self.v0 is not None and 'v0' not in already_processed:
             already_processed.add('v0')
             outfile.write(' v0=%s' % (quote_attrib(self.v0), ))
         if self.thresh is not None and 'thresh' not in already_processed:
             already_processed.add('thresh')
             outfile.write(' thresh=%s' % (quote_attrib(self.thresh), ))
-    def exportChildren(self, outfile, level, namespace_='', name_='IzhikevichCell', fromsubclass_=False, pretty_print=True):
-        super(IzhikevichCell, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='IzhikevichCell'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
         if self.a is not None and 'a' not in already_processed:
             already_processed.add('a')
-            showIndent(outfile, level)
-            outfile.write('a="%s",\n' % (self.a,))
-        if self.c is not None and 'c' not in already_processed:
-            already_processed.add('c')
-            showIndent(outfile, level)
-            outfile.write('c="%s",\n' % (self.c,))
+            outfile.write(' a=%s' % (quote_attrib(self.a), ))
         if self.b is not None and 'b' not in already_processed:
             already_processed.add('b')
-            showIndent(outfile, level)
-            outfile.write('b="%s",\n' % (self.b,))
+            outfile.write(' b=%s' % (quote_attrib(self.b), ))
+        if self.c is not None and 'c' not in already_processed:
+            already_processed.add('c')
+            outfile.write(' c=%s' % (quote_attrib(self.c), ))
         if self.d is not None and 'd' not in already_processed:
             already_processed.add('d')
-            showIndent(outfile, level)
-            outfile.write('d="%s",\n' % (self.d,))
-        if self.v0 is not None and 'v0' not in already_processed:
-            already_processed.add('v0')
-            showIndent(outfile, level)
-            outfile.write('v0="%s",\n' % (self.v0,))
-        if self.thresh is not None and 'thresh' not in already_processed:
-            already_processed.add('thresh')
-            showIndent(outfile, level)
-            outfile.write('thresh="%s",\n' % (self.thresh,))
-        super(IzhikevichCell, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(IzhikevichCell, self).exportLiteralChildren(outfile, level, name_)
+            outfile.write(' d=%s' % (quote_attrib(self.d), ))
+    def exportChildren(self, outfile, level, namespace_='', name_='IzhikevichCell', fromsubclass_=False, pretty_print=True):
+        super(IzhikevichCell, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
+        value = find_attr_value_('v0', node)
+        if value is not None and 'v0' not in already_processed:
+            already_processed.add('v0')
+            self.v0 = value
+            self.validate_Nml2Quantity_voltage(self.v0)    # validate type Nml2Quantity_voltage
+        value = find_attr_value_('thresh', node)
+        if value is not None and 'thresh' not in already_processed:
+            already_processed.add('thresh')
+            self.thresh = value
+            self.validate_Nml2Quantity_voltage(self.thresh)    # validate type Nml2Quantity_voltage
         value = find_attr_value_('a', node)
         if value is not None and 'a' not in already_processed:
             already_processed.add('a')
             self.a = value
             self.validate_Nml2Quantity_none(self.a)    # validate type Nml2Quantity_none
-        value = find_attr_value_('c', node)
-        if value is not None and 'c' not in already_processed:
-            already_processed.add('c')
-            self.c = value
-            self.validate_Nml2Quantity_none(self.c)    # validate type Nml2Quantity_none
         value = find_attr_value_('b', node)
         if value is not None and 'b' not in already_processed:
             already_processed.add('b')
             self.b = value
             self.validate_Nml2Quantity_none(self.b)    # validate type Nml2Quantity_none
+        value = find_attr_value_('c', node)
+        if value is not None and 'c' not in already_processed:
+            already_processed.add('c')
+            self.c = value
+            self.validate_Nml2Quantity_none(self.c)    # validate type Nml2Quantity_none
         value = find_attr_value_('d', node)
         if value is not None and 'd' not in already_processed:
             already_processed.add('d')
             self.d = value
             self.validate_Nml2Quantity_none(self.d)    # validate type Nml2Quantity_none
-        value = find_attr_value_('v0', node)
-        if value is not None and 'v0' not in already_processed:
-            already_processed.add('v0')
-            self.v0 = value
-            self.validate_Nml2Quantity_voltage(self.v0)    # validate type Nml2Quantity_voltage
-        value = find_attr_value_('thresh', node)
-        if value is not None and 'thresh' not in already_processed:
-            already_processed.add('thresh')
-            self.thresh = value
-            self.validate_Nml2Quantity_voltage(self.thresh)    # validate type Nml2Quantity_voltage
         super(IzhikevichCell, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         super(IzhikevichCell, self).buildChildren(child_, node, nodeName_, True)
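Every `exportAttributes` body in these classes follows the same guard idiom: check for `None`, check the shared `already_processed` set, then write. A minimal sketch of that idiom in isolation (`export_attributes` is a hypothetical helper, not part of the generated module; the set matters because subclass and superclass exporters both run over the same attributes):

```python
def export_attributes(attrs, already_processed=None):
    """Render (name, value) pairs as an XML attribute string.

    Skips None values and any name already emitted, mirroring the
    'not in already_processed' guard used throughout the generated
    exporters so an attribute is written at most once.
    """
    if already_processed is None:
        already_processed = set()
    parts = []
    for name, value in attrs:
        if value is not None and name not in already_processed:
            already_processed.add(name)
            parts.append(' %s="%s"' % (name, value))
    return ''.join(parts)

# The duplicate v0 and the None-valued thresh are both skipped.
print(export_attributes([('v0', '-70mV'), ('thresh', None), ('v0', '0mV')]))
# → ' v0="-70mV"'
```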
@@ -11888,41 +10432,59 @@ class IzhikevichCell(BaseCell):
 class IaFCell(BaseCell):
     subclass = None
     superclass = BaseCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, reset=None, C=None, thresh=None, leakConductance=None, leakReversal=None, extensiontype_=None):
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, leakReversal=None, thresh=None, reset=None, C=None, leakConductance=None, extensiontype_=None):
+        self.original_tagname_ = None
         super(IaFCell, self).__init__(id, neuroLexId, name, metaid, notes, annotation, extensiontype_, )
+        self.leakReversal = _cast(None, leakReversal)
+        self.thresh = _cast(None, thresh)
         self.reset = _cast(None, reset)
         self.C = _cast(None, C)
-        self.thresh = _cast(None, thresh)
         self.leakConductance = _cast(None, leakConductance)
-        self.leakReversal = _cast(None, leakReversal)
         self.extensiontype_ = extensiontype_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, IaFCell)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if IaFCell.subclass:
             return IaFCell.subclass(*args_, **kwargs_)
         else:
             return IaFCell(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_reset(self): return self.reset
-    def set_reset(self, reset): self.reset = reset
-    def validate_Nml2Quantity_voltage(self, value):
-        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
-        pass
-    def get_C(self): return self.C
-    def set_C(self, C): self.C = C
-    def validate_Nml2Quantity_capacitance(self, value):
-        # Validate type Nml2Quantity_capacitance, a restriction on xs:string.
-        pass
+    def get_leakReversal(self): return self.leakReversal
+    def set_leakReversal(self, leakReversal): self.leakReversal = leakReversal
     def get_thresh(self): return self.thresh
     def set_thresh(self, thresh): self.thresh = thresh
+    def get_reset(self): return self.reset
+    def set_reset(self, reset): self.reset = reset
+    def get_C(self): return self.C
+    def set_C(self, C): self.C = C
     def get_leakConductance(self): return self.leakConductance
     def set_leakConductance(self, leakConductance): self.leakConductance = leakConductance
-    def validate_Nml2Quantity_conductance(self, value):
-        # Validate type Nml2Quantity_conductance, a restriction on xs:string.
-        pass
-    def get_leakReversal(self): return self.leakReversal
-    def set_leakReversal(self, leakReversal): self.leakReversal = leakReversal
     def get_extensiontype_(self): return self.extensiontype_
     def set_extensiontype_(self, extensiontype_): self.extensiontype_ = extensiontype_
+    def validate_Nml2Quantity_voltage(self, value):
+        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_voltage_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_voltage_patterns_, ))
+    validate_Nml2Quantity_voltage_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(V|mV)$']]
+    def validate_Nml2Quantity_capacitance(self, value):
+        # Validate type Nml2Quantity_capacitance, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_capacitance_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_capacitance_patterns_, ))
+    validate_Nml2Quantity_capacitance_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(F|uF|nF|pF)$']]
+    def validate_Nml2Quantity_conductance(self, value):
+        # Validate type Nml2Quantity_conductance, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_conductance_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_conductance_patterns_, ))
+    validate_Nml2Quantity_conductance_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(S|mS|uS|nS|pS)$']]
     def hasContent_(self):
         if (
             super(IaFCell, self).hasContent_()
@@ -11935,77 +10497,60 @@ class IaFCell(BaseCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='IaFCell')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='IaFCell', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='IaFCell'):
         super(IaFCell, self).exportAttributes(outfile, level, already_processed, namespace_, name_='IaFCell')
+        if self.leakReversal is not None and 'leakReversal' not in already_processed:
+            already_processed.add('leakReversal')
+            outfile.write(' leakReversal=%s' % (quote_attrib(self.leakReversal), ))
+        if self.thresh is not None and 'thresh' not in already_processed:
+            already_processed.add('thresh')
+            outfile.write(' thresh=%s' % (quote_attrib(self.thresh), ))
         if self.reset is not None and 'reset' not in already_processed:
             already_processed.add('reset')
             outfile.write(' reset=%s' % (quote_attrib(self.reset), ))
         if self.C is not None and 'C' not in already_processed:
             already_processed.add('C')
             outfile.write(' C=%s' % (quote_attrib(self.C), ))
-        if self.thresh is not None and 'thresh' not in already_processed:
-            already_processed.add('thresh')
-            outfile.write(' thresh=%s' % (quote_attrib(self.thresh), ))
         if self.leakConductance is not None and 'leakConductance' not in already_processed:
             already_processed.add('leakConductance')
             outfile.write(' leakConductance=%s' % (quote_attrib(self.leakConductance), ))
-        if self.leakReversal is not None and 'leakReversal' not in already_processed:
-            already_processed.add('leakReversal')
-            outfile.write(' leakReversal=%s' % (quote_attrib(self.leakReversal), ))
         if self.extensiontype_ is not None and 'xsi:type' not in already_processed:
             already_processed.add('xsi:type')
             outfile.write(' xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"')
             outfile.write(' xsi:type="%s"' % self.extensiontype_)
     def exportChildren(self, outfile, level, namespace_='', name_='IaFCell', fromsubclass_=False, pretty_print=True):
         super(IaFCell, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='IaFCell'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.reset is not None and 'reset' not in already_processed:
-            already_processed.add('reset')
-            showIndent(outfile, level)
-            outfile.write('reset="%s",\n' % (self.reset,))
-        if self.C is not None and 'C' not in already_processed:
-            already_processed.add('C')
-            showIndent(outfile, level)
-            outfile.write('C="%s",\n' % (self.C,))
-        if self.thresh is not None and 'thresh' not in already_processed:
-            already_processed.add('thresh')
-            showIndent(outfile, level)
-            outfile.write('thresh="%s",\n' % (self.thresh,))
-        if self.leakConductance is not None and 'leakConductance' not in already_processed:
-            already_processed.add('leakConductance')
-            showIndent(outfile, level)
-            outfile.write('leakConductance="%s",\n' % (self.leakConductance,))
-        if self.leakReversal is not None and 'leakReversal' not in already_processed:
-            already_processed.add('leakReversal')
-            showIndent(outfile, level)
-            outfile.write('leakReversal="%s",\n' % (self.leakReversal,))
-        super(IaFCell, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(IaFCell, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
+        value = find_attr_value_('leakReversal', node)
+        if value is not None and 'leakReversal' not in already_processed:
+            already_processed.add('leakReversal')
+            self.leakReversal = value
+            self.validate_Nml2Quantity_voltage(self.leakReversal)    # validate type Nml2Quantity_voltage
+        value = find_attr_value_('thresh', node)
+        if value is not None and 'thresh' not in already_processed:
+            already_processed.add('thresh')
+            self.thresh = value
+            self.validate_Nml2Quantity_voltage(self.thresh)    # validate type Nml2Quantity_voltage
         value = find_attr_value_('reset', node)
         if value is not None and 'reset' not in already_processed:
             already_processed.add('reset')
@@ -12016,21 +10561,11 @@ class IaFCell(BaseCell):
             already_processed.add('C')
             self.C = value
             self.validate_Nml2Quantity_capacitance(self.C)    # validate type Nml2Quantity_capacitance
-        value = find_attr_value_('thresh', node)
-        if value is not None and 'thresh' not in already_processed:
-            already_processed.add('thresh')
-            self.thresh = value
-            self.validate_Nml2Quantity_voltage(self.thresh)    # validate type Nml2Quantity_voltage
         value = find_attr_value_('leakConductance', node)
         if value is not None and 'leakConductance' not in already_processed:
             already_processed.add('leakConductance')
             self.leakConductance = value
             self.validate_Nml2Quantity_conductance(self.leakConductance)    # validate type Nml2Quantity_conductance
-        value = find_attr_value_('leakReversal', node)
-        if value is not None and 'leakReversal' not in already_processed:
-            already_processed.add('leakReversal')
-            self.leakReversal = value
-            self.validate_Nml2Quantity_voltage(self.leakReversal)    # validate type Nml2Quantity_voltage
         value = find_attr_value_('xsi:type', node)
         if value is not None and 'xsi:type' not in already_processed:
             already_processed.add('xsi:type')
@@ -12045,35 +10580,49 @@ class IaFCell(BaseCell):
 class IaFTauCell(BaseCell):
     subclass = None
     superclass = BaseCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, reset=None, tau=None, thresh=None, leakReversal=None, extensiontype_=None):
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, leakReversal=None, thresh=None, reset=None, tau=None, extensiontype_=None):
+        self.original_tagname_ = None
         super(IaFTauCell, self).__init__(id, neuroLexId, name, metaid, notes, annotation, extensiontype_, )
+        self.leakReversal = _cast(None, leakReversal)
+        self.thresh = _cast(None, thresh)
         self.reset = _cast(None, reset)
         self.tau = _cast(None, tau)
-        self.thresh = _cast(None, thresh)
-        self.leakReversal = _cast(None, leakReversal)
         self.extensiontype_ = extensiontype_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, IaFTauCell)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if IaFTauCell.subclass:
             return IaFTauCell.subclass(*args_, **kwargs_)
         else:
             return IaFTauCell(*args_, **kwargs_)
     factory = staticmethod(factory)
+    def get_leakReversal(self): return self.leakReversal
+    def set_leakReversal(self, leakReversal): self.leakReversal = leakReversal
+    def get_thresh(self): return self.thresh
+    def set_thresh(self, thresh): self.thresh = thresh
     def get_reset(self): return self.reset
     def set_reset(self, reset): self.reset = reset
-    def validate_Nml2Quantity_voltage(self, value):
-        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
-        pass
     def get_tau(self): return self.tau
     def set_tau(self, tau): self.tau = tau
-    def validate_Nml2Quantity_time(self, value):
-        # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
-    def get_thresh(self): return self.thresh
-    def set_thresh(self, thresh): self.thresh = thresh
-    def get_leakReversal(self): return self.leakReversal
-    def set_leakReversal(self, leakReversal): self.leakReversal = leakReversal
     def get_extensiontype_(self): return self.extensiontype_
     def set_extensiontype_(self, extensiontype_): self.extensiontype_ = extensiontype_
+    def validate_Nml2Quantity_voltage(self, value):
+        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_voltage_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_voltage_patterns_, ))
+    validate_Nml2Quantity_voltage_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(V|mV)$']]
+    def validate_Nml2Quantity_time(self, value):
+        # Validate type Nml2Quantity_time, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
     def hasContent_(self):
         if (
             super(IaFTauCell, self).hasContent_()
@@ -12086,70 +10635,57 @@ class IaFTauCell(BaseCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='IaFTauCell')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='IaFTauCell', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='IaFTauCell'):
         super(IaFTauCell, self).exportAttributes(outfile, level, already_processed, namespace_, name_='IaFTauCell')
+        if self.leakReversal is not None and 'leakReversal' not in already_processed:
+            already_processed.add('leakReversal')
+            outfile.write(' leakReversal=%s' % (quote_attrib(self.leakReversal), ))
+        if self.thresh is not None and 'thresh' not in already_processed:
+            already_processed.add('thresh')
+            outfile.write(' thresh=%s' % (quote_attrib(self.thresh), ))
         if self.reset is not None and 'reset' not in already_processed:
             already_processed.add('reset')
             outfile.write(' reset=%s' % (quote_attrib(self.reset), ))
         if self.tau is not None and 'tau' not in already_processed:
             already_processed.add('tau')
             outfile.write(' tau=%s' % (quote_attrib(self.tau), ))
-        if self.thresh is not None and 'thresh' not in already_processed:
-            already_processed.add('thresh')
-            outfile.write(' thresh=%s' % (quote_attrib(self.thresh), ))
-        if self.leakReversal is not None and 'leakReversal' not in already_processed:
-            already_processed.add('leakReversal')
-            outfile.write(' leakReversal=%s' % (quote_attrib(self.leakReversal), ))
         if self.extensiontype_ is not None and 'xsi:type' not in already_processed:
             already_processed.add('xsi:type')
             outfile.write(' xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"')
             outfile.write(' xsi:type="%s"' % self.extensiontype_)
     def exportChildren(self, outfile, level, namespace_='', name_='IaFTauCell', fromsubclass_=False, pretty_print=True):
         super(IaFTauCell, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='IaFTauCell'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.reset is not None and 'reset' not in already_processed:
-            already_processed.add('reset')
-            showIndent(outfile, level)
-            outfile.write('reset="%s",\n' % (self.reset,))
-        if self.tau is not None and 'tau' not in already_processed:
-            already_processed.add('tau')
-            showIndent(outfile, level)
-            outfile.write('tau="%s",\n' % (self.tau,))
-        if self.thresh is not None and 'thresh' not in already_processed:
-            already_processed.add('thresh')
-            showIndent(outfile, level)
-            outfile.write('thresh="%s",\n' % (self.thresh,))
-        if self.leakReversal is not None and 'leakReversal' not in already_processed:
-            already_processed.add('leakReversal')
-            showIndent(outfile, level)
-            outfile.write('leakReversal="%s",\n' % (self.leakReversal,))
-        super(IaFTauCell, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(IaFTauCell, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
+        value = find_attr_value_('leakReversal', node)
+        if value is not None and 'leakReversal' not in already_processed:
+            already_processed.add('leakReversal')
+            self.leakReversal = value
+            self.validate_Nml2Quantity_voltage(self.leakReversal)    # validate type Nml2Quantity_voltage
+        value = find_attr_value_('thresh', node)
+        if value is not None and 'thresh' not in already_processed:
+            already_processed.add('thresh')
+            self.thresh = value
+            self.validate_Nml2Quantity_voltage(self.thresh)    # validate type Nml2Quantity_voltage
         value = find_attr_value_('reset', node)
         if value is not None and 'reset' not in already_processed:
             already_processed.add('reset')
@@ -12160,16 +10696,6 @@ class IaFTauCell(BaseCell):
             already_processed.add('tau')
             self.tau = value
             self.validate_Nml2Quantity_time(self.tau)    # validate type Nml2Quantity_time
-        value = find_attr_value_('thresh', node)
-        if value is not None and 'thresh' not in already_processed:
-            already_processed.add('thresh')
-            self.thresh = value
-            self.validate_Nml2Quantity_voltage(self.thresh)    # validate type Nml2Quantity_voltage
-        value = find_attr_value_('leakReversal', node)
-        if value is not None and 'leakReversal' not in already_processed:
-            already_processed.add('leakReversal')
-            self.leakReversal = value
-            self.validate_Nml2Quantity_voltage(self.leakReversal)    # validate type Nml2Quantity_voltage
         value = find_attr_value_('xsi:type', node)
         if value is not None and 'xsi:type' not in already_processed:
             already_processed.add('xsi:type')
@@ -12184,29 +10710,43 @@ class IaFTauCell(BaseCell):
 class BaseConductanceBasedSynapse(BaseSynapse):
     subclass = None
     superclass = BaseSynapse
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, erev=None, gbase=None, extensiontype_=None):
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, gbase=None, erev=None, extensiontype_=None):
+        self.original_tagname_ = None
         super(BaseConductanceBasedSynapse, self).__init__(id, neuroLexId, name, metaid, notes, annotation, extensiontype_, )
-        self.erev = _cast(None, erev)
         self.gbase = _cast(None, gbase)
+        self.erev = _cast(None, erev)
         self.extensiontype_ = extensiontype_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, BaseConductanceBasedSynapse)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if BaseConductanceBasedSynapse.subclass:
             return BaseConductanceBasedSynapse.subclass(*args_, **kwargs_)
         else:
             return BaseConductanceBasedSynapse(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_erev(self): return self.erev
-    def set_erev(self, erev): self.erev = erev
-    def validate_Nml2Quantity_voltage(self, value):
-        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
-        pass
     def get_gbase(self): return self.gbase
     def set_gbase(self, gbase): self.gbase = gbase
-    def validate_Nml2Quantity_conductance(self, value):
-        # Validate type Nml2Quantity_conductance, a restriction on xs:string.
-        pass
+    def get_erev(self): return self.erev
+    def set_erev(self, erev): self.erev = erev
     def get_extensiontype_(self): return self.extensiontype_
     def set_extensiontype_(self, extensiontype_): self.extensiontype_ = extensiontype_
+    def validate_Nml2Quantity_conductance(self, value):
+        # Validate type Nml2Quantity_conductance, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_conductance_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_conductance_patterns_, ))
+    validate_Nml2Quantity_conductance_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(S|mS|uS|nS|pS)$']]
+    def validate_Nml2Quantity_voltage(self, value):
+        # Validate type Nml2Quantity_voltage, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_voltage_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_voltage_patterns_, ))
+    validate_Nml2Quantity_voltage_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(V|mV)$']]
     def hasContent_(self):
         if (
             super(BaseConductanceBasedSynapse, self).hasContent_()
@@ -12219,66 +10759,51 @@ class BaseConductanceBasedSynapse(BaseSynapse):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='BaseConductanceBasedSynapse')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='BaseConductanceBasedSynapse', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='BaseConductanceBasedSynapse'):
         super(BaseConductanceBasedSynapse, self).exportAttributes(outfile, level, already_processed, namespace_, name_='BaseConductanceBasedSynapse')
-        if self.erev is not None and 'erev' not in already_processed:
-            already_processed.add('erev')
-            outfile.write(' erev=%s' % (quote_attrib(self.erev), ))
         if self.gbase is not None and 'gbase' not in already_processed:
             already_processed.add('gbase')
             outfile.write(' gbase=%s' % (quote_attrib(self.gbase), ))
+        if self.erev is not None and 'erev' not in already_processed:
+            already_processed.add('erev')
+            outfile.write(' erev=%s' % (quote_attrib(self.erev), ))
         if self.extensiontype_ is not None and 'xsi:type' not in already_processed:
             already_processed.add('xsi:type')
             outfile.write(' xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"')
             outfile.write(' xsi:type="%s"' % self.extensiontype_)
     def exportChildren(self, outfile, level, namespace_='', name_='BaseConductanceBasedSynapse', fromsubclass_=False, pretty_print=True):
         super(BaseConductanceBasedSynapse, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='BaseConductanceBasedSynapse'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.erev is not None and 'erev' not in already_processed:
-            already_processed.add('erev')
-            showIndent(outfile, level)
-            outfile.write('erev="%s",\n' % (self.erev,))
-        if self.gbase is not None and 'gbase' not in already_processed:
-            already_processed.add('gbase')
-            showIndent(outfile, level)
-            outfile.write('gbase="%s",\n' % (self.gbase,))
-        super(BaseConductanceBasedSynapse, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(BaseConductanceBasedSynapse, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('erev', node)
-        if value is not None and 'erev' not in already_processed:
-            already_processed.add('erev')
-            self.erev = value
-            self.validate_Nml2Quantity_voltage(self.erev)    # validate type Nml2Quantity_voltage
         value = find_attr_value_('gbase', node)
         if value is not None and 'gbase' not in already_processed:
             already_processed.add('gbase')
             self.gbase = value
             self.validate_Nml2Quantity_conductance(self.gbase)    # validate type Nml2Quantity_conductance
+        value = find_attr_value_('erev', node)
+        if value is not None and 'erev' not in already_processed:
+            already_processed.add('erev')
+            self.erev = value
+            self.validate_Nml2Quantity_voltage(self.erev)    # validate type Nml2Quantity_voltage
         value = find_attr_value_('xsi:type', node)
         if value is not None and 'xsi:type' not in already_processed:
             already_processed.add('xsi:type')
@@ -12294,9 +10819,14 @@ class AlphaCurrSynapse(BasePynnSynapse):
     subclass = None
     superclass = BasePynnSynapse
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn=None):
+        self.original_tagname_ = None
         super(AlphaCurrSynapse, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn, )
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, AlphaCurrSynapse)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if AlphaCurrSynapse.subclass:
             return AlphaCurrSynapse.subclass(*args_, **kwargs_)
         else:
@@ -12314,13 +10844,15 @@ class AlphaCurrSynapse(BasePynnSynapse):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='AlphaCurrSynapse')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='AlphaCurrSynapse', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -12329,22 +10861,13 @@ class AlphaCurrSynapse(BasePynnSynapse):
         super(AlphaCurrSynapse, self).exportAttributes(outfile, level, already_processed, namespace_, name_='AlphaCurrSynapse')
     def exportChildren(self, outfile, level, namespace_='', name_='AlphaCurrSynapse', fromsubclass_=False, pretty_print=True):
         super(AlphaCurrSynapse, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='AlphaCurrSynapse'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        super(AlphaCurrSynapse, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(AlphaCurrSynapse, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         super(AlphaCurrSynapse, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
@@ -12357,9 +10880,14 @@ class ExpCurrSynapse(BasePynnSynapse):
     subclass = None
     superclass = BasePynnSynapse
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn=None):
+        self.original_tagname_ = None
         super(ExpCurrSynapse, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn, )
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, ExpCurrSynapse)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if ExpCurrSynapse.subclass:
             return ExpCurrSynapse.subclass(*args_, **kwargs_)
         else:
@@ -12377,13 +10905,15 @@ class ExpCurrSynapse(BasePynnSynapse):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='ExpCurrSynapse')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='ExpCurrSynapse', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -12392,22 +10922,13 @@ class ExpCurrSynapse(BasePynnSynapse):
         super(ExpCurrSynapse, self).exportAttributes(outfile, level, already_processed, namespace_, name_='ExpCurrSynapse')
     def exportChildren(self, outfile, level, namespace_='', name_='ExpCurrSynapse', fromsubclass_=False, pretty_print=True):
         super(ExpCurrSynapse, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='ExpCurrSynapse'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        super(ExpCurrSynapse, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(ExpCurrSynapse, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         super(ExpCurrSynapse, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
@@ -12420,10 +10941,15 @@ class AlphaCondSynapse(BasePynnSynapse):
     subclass = None
     superclass = BasePynnSynapse
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn=None, e_rev=None):
+        self.original_tagname_ = None
         super(AlphaCondSynapse, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn, )
         self.e_rev = _cast(float, e_rev)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, AlphaCondSynapse)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if AlphaCondSynapse.subclass:
             return AlphaCondSynapse.subclass(*args_, **kwargs_)
         else:
@@ -12443,13 +10969,15 @@ class AlphaCondSynapse(BasePynnSynapse):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='AlphaCondSynapse')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='AlphaCondSynapse', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -12461,26 +10989,13 @@ class AlphaCondSynapse(BasePynnSynapse):
             outfile.write(' e_rev="%s"' % self.gds_format_double(self.e_rev, input_name='e_rev'))
     def exportChildren(self, outfile, level, namespace_='', name_='AlphaCondSynapse', fromsubclass_=False, pretty_print=True):
         super(AlphaCondSynapse, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='AlphaCondSynapse'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.e_rev is not None and 'e_rev' not in already_processed:
-            already_processed.add('e_rev')
-            showIndent(outfile, level)
-            outfile.write('e_rev=%e,\n' % (self.e_rev,))
-        super(AlphaCondSynapse, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(AlphaCondSynapse, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('e_rev', node)
         if value is not None and 'e_rev' not in already_processed:
@@ -12500,10 +11015,15 @@ class ExpCondSynapse(BasePynnSynapse):
     subclass = None
     superclass = BasePynnSynapse
     def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn=None, e_rev=None):
+        self.original_tagname_ = None
         super(ExpCondSynapse, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn, )
         self.e_rev = _cast(float, e_rev)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, ExpCondSynapse)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if ExpCondSynapse.subclass:
             return ExpCondSynapse.subclass(*args_, **kwargs_)
         else:
@@ -12523,13 +11043,15 @@ class ExpCondSynapse(BasePynnSynapse):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='ExpCondSynapse')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='ExpCondSynapse', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -12541,26 +11063,13 @@ class ExpCondSynapse(BasePynnSynapse):
             outfile.write(' e_rev="%s"' % self.gds_format_double(self.e_rev, input_name='e_rev'))
     def exportChildren(self, outfile, level, namespace_='', name_='ExpCondSynapse', fromsubclass_=False, pretty_print=True):
         super(ExpCondSynapse, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='ExpCondSynapse'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.e_rev is not None and 'e_rev' not in already_processed:
-            already_processed.add('e_rev')
-            showIndent(outfile, level)
-            outfile.write('e_rev=%e,\n' % (self.e_rev,))
-        super(ExpCondSynapse, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(ExpCondSynapse, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('e_rev', node)
         if value is not None and 'e_rev' not in already_processed:
@@ -12579,40 +11088,45 @@ class ExpCondSynapse(BasePynnSynapse):
 class HH_cond_exp(basePyNNCell):
     subclass = None
     superclass = basePyNNCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, gbar_K=None, e_rev_E=None, g_leak=None, e_rev_Na=None, e_rev_I=None, e_rev_K=None, e_rev_leak=None, v_offset=None, gbar_Na=None):
-        super(HH_cond_exp, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, )
-        self.gbar_K = _cast(float, gbar_K)
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, v_offset=None, e_rev_E=None, e_rev_I=None, e_rev_K=None, e_rev_Na=None, e_rev_leak=None, g_leak=None, gbar_K=None, gbar_Na=None):
+        self.original_tagname_ = None
+        super(HH_cond_exp, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, )
+        self.v_offset = _cast(float, v_offset)
         self.e_rev_E = _cast(float, e_rev_E)
-        self.g_leak = _cast(float, g_leak)
-        self.e_rev_Na = _cast(float, e_rev_Na)
         self.e_rev_I = _cast(float, e_rev_I)
         self.e_rev_K = _cast(float, e_rev_K)
+        self.e_rev_Na = _cast(float, e_rev_Na)
         self.e_rev_leak = _cast(float, e_rev_leak)
-        self.v_offset = _cast(float, v_offset)
+        self.g_leak = _cast(float, g_leak)
+        self.gbar_K = _cast(float, gbar_K)
         self.gbar_Na = _cast(float, gbar_Na)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, HH_cond_exp)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if HH_cond_exp.subclass:
             return HH_cond_exp.subclass(*args_, **kwargs_)
         else:
             return HH_cond_exp(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_gbar_K(self): return self.gbar_K
-    def set_gbar_K(self, gbar_K): self.gbar_K = gbar_K
+    def get_v_offset(self): return self.v_offset
+    def set_v_offset(self, v_offset): self.v_offset = v_offset
     def get_e_rev_E(self): return self.e_rev_E
     def set_e_rev_E(self, e_rev_E): self.e_rev_E = e_rev_E
-    def get_g_leak(self): return self.g_leak
-    def set_g_leak(self, g_leak): self.g_leak = g_leak
-    def get_e_rev_Na(self): return self.e_rev_Na
-    def set_e_rev_Na(self, e_rev_Na): self.e_rev_Na = e_rev_Na
     def get_e_rev_I(self): return self.e_rev_I
     def set_e_rev_I(self, e_rev_I): self.e_rev_I = e_rev_I
     def get_e_rev_K(self): return self.e_rev_K
     def set_e_rev_K(self, e_rev_K): self.e_rev_K = e_rev_K
+    def get_e_rev_Na(self): return self.e_rev_Na
+    def set_e_rev_Na(self, e_rev_Na): self.e_rev_Na = e_rev_Na
     def get_e_rev_leak(self): return self.e_rev_leak
     def set_e_rev_leak(self, e_rev_leak): self.e_rev_leak = e_rev_leak
-    def get_v_offset(self): return self.v_offset
-    def set_v_offset(self, v_offset): self.v_offset = v_offset
+    def get_g_leak(self): return self.g_leak
+    def set_g_leak(self, g_leak): self.g_leak = g_leak
+    def get_gbar_K(self): return self.gbar_K
+    def set_gbar_K(self, gbar_K): self.gbar_K = gbar_K
     def get_gbar_Na(self): return self.gbar_Na
     def set_gbar_Na(self, gbar_Na): self.gbar_Na = gbar_Na
     def hasContent_(self):
@@ -12627,108 +11141,65 @@ class HH_cond_exp(basePyNNCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='HH_cond_exp')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='HH_cond_exp', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='HH_cond_exp'):
         super(HH_cond_exp, self).exportAttributes(outfile, level, already_processed, namespace_, name_='HH_cond_exp')
-        if self.gbar_K is not None and 'gbar_K' not in already_processed:
-            already_processed.add('gbar_K')
-            outfile.write(' gbar_K="%s"' % self.gds_format_double(self.gbar_K, input_name='gbar_K'))
-        if self.e_rev_E is not None and 'e_rev_E' not in already_processed:
-            already_processed.add('e_rev_E')
-            outfile.write(' e_rev_E="%s"' % self.gds_format_double(self.e_rev_E, input_name='e_rev_E'))
-        if self.g_leak is not None and 'g_leak' not in already_processed:
-            already_processed.add('g_leak')
-            outfile.write(' g_leak="%s"' % self.gds_format_double(self.g_leak, input_name='g_leak'))
-        if self.e_rev_Na is not None and 'e_rev_Na' not in already_processed:
-            already_processed.add('e_rev_Na')
-            outfile.write(' e_rev_Na="%s"' % self.gds_format_double(self.e_rev_Na, input_name='e_rev_Na'))
-        if self.e_rev_I is not None and 'e_rev_I' not in already_processed:
-            already_processed.add('e_rev_I')
-            outfile.write(' e_rev_I="%s"' % self.gds_format_double(self.e_rev_I, input_name='e_rev_I'))
-        if self.e_rev_K is not None and 'e_rev_K' not in already_processed:
-            already_processed.add('e_rev_K')
-            outfile.write(' e_rev_K="%s"' % self.gds_format_double(self.e_rev_K, input_name='e_rev_K'))
-        if self.e_rev_leak is not None and 'e_rev_leak' not in already_processed:
-            already_processed.add('e_rev_leak')
-            outfile.write(' e_rev_leak="%s"' % self.gds_format_double(self.e_rev_leak, input_name='e_rev_leak'))
         if self.v_offset is not None and 'v_offset' not in already_processed:
             already_processed.add('v_offset')
             outfile.write(' v_offset="%s"' % self.gds_format_double(self.v_offset, input_name='v_offset'))
-        if self.gbar_Na is not None and 'gbar_Na' not in already_processed:
-            already_processed.add('gbar_Na')
-            outfile.write(' gbar_Na="%s"' % self.gds_format_double(self.gbar_Na, input_name='gbar_Na'))
-    def exportChildren(self, outfile, level, namespace_='', name_='HH_cond_exp', fromsubclass_=False, pretty_print=True):
-        super(HH_cond_exp, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='HH_cond_exp'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.gbar_K is not None and 'gbar_K' not in already_processed:
-            already_processed.add('gbar_K')
-            showIndent(outfile, level)
-            outfile.write('gbar_K=%e,\n' % (self.gbar_K,))
         if self.e_rev_E is not None and 'e_rev_E' not in already_processed:
             already_processed.add('e_rev_E')
-            showIndent(outfile, level)
-            outfile.write('e_rev_E=%e,\n' % (self.e_rev_E,))
-        if self.g_leak is not None and 'g_leak' not in already_processed:
-            already_processed.add('g_leak')
-            showIndent(outfile, level)
-            outfile.write('g_leak=%e,\n' % (self.g_leak,))
-        if self.e_rev_Na is not None and 'e_rev_Na' not in already_processed:
-            already_processed.add('e_rev_Na')
-            showIndent(outfile, level)
-            outfile.write('e_rev_Na=%e,\n' % (self.e_rev_Na,))
+            outfile.write(' e_rev_E="%s"' % self.gds_format_double(self.e_rev_E, input_name='e_rev_E'))
         if self.e_rev_I is not None and 'e_rev_I' not in already_processed:
             already_processed.add('e_rev_I')
-            showIndent(outfile, level)
-            outfile.write('e_rev_I=%e,\n' % (self.e_rev_I,))
+            outfile.write(' e_rev_I="%s"' % self.gds_format_double(self.e_rev_I, input_name='e_rev_I'))
         if self.e_rev_K is not None and 'e_rev_K' not in already_processed:
             already_processed.add('e_rev_K')
-            showIndent(outfile, level)
-            outfile.write('e_rev_K=%e,\n' % (self.e_rev_K,))
+            outfile.write(' e_rev_K="%s"' % self.gds_format_double(self.e_rev_K, input_name='e_rev_K'))
+        if self.e_rev_Na is not None and 'e_rev_Na' not in already_processed:
+            already_processed.add('e_rev_Na')
+            outfile.write(' e_rev_Na="%s"' % self.gds_format_double(self.e_rev_Na, input_name='e_rev_Na'))
         if self.e_rev_leak is not None and 'e_rev_leak' not in already_processed:
             already_processed.add('e_rev_leak')
-            showIndent(outfile, level)
-            outfile.write('e_rev_leak=%e,\n' % (self.e_rev_leak,))
-        if self.v_offset is not None and 'v_offset' not in already_processed:
-            already_processed.add('v_offset')
-            showIndent(outfile, level)
-            outfile.write('v_offset=%e,\n' % (self.v_offset,))
+            outfile.write(' e_rev_leak="%s"' % self.gds_format_double(self.e_rev_leak, input_name='e_rev_leak'))
+        if self.g_leak is not None and 'g_leak' not in already_processed:
+            already_processed.add('g_leak')
+            outfile.write(' g_leak="%s"' % self.gds_format_double(self.g_leak, input_name='g_leak'))
+        if self.gbar_K is not None and 'gbar_K' not in already_processed:
+            already_processed.add('gbar_K')
+            outfile.write(' gbar_K="%s"' % self.gds_format_double(self.gbar_K, input_name='gbar_K'))
         if self.gbar_Na is not None and 'gbar_Na' not in already_processed:
             already_processed.add('gbar_Na')
-            showIndent(outfile, level)
-            outfile.write('gbar_Na=%e,\n' % (self.gbar_Na,))
-        super(HH_cond_exp, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(HH_cond_exp, self).exportLiteralChildren(outfile, level, name_)
+            outfile.write(' gbar_Na="%s"' % self.gds_format_double(self.gbar_Na, input_name='gbar_Na'))
+    def exportChildren(self, outfile, level, namespace_='', name_='HH_cond_exp', fromsubclass_=False, pretty_print=True):
+        super(HH_cond_exp, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('gbar_K', node)
-        if value is not None and 'gbar_K' not in already_processed:
-            already_processed.add('gbar_K')
+        value = find_attr_value_('v_offset', node)
+        if value is not None and 'v_offset' not in already_processed:
+            already_processed.add('v_offset')
             try:
-                self.gbar_K = float(value)
+                self.v_offset = float(value)
             except ValueError as exp:
-                raise ValueError('Bad float/double attribute (gbar_K): %s' % exp)
+                raise ValueError('Bad float/double attribute (v_offset): %s' % exp)
         value = find_attr_value_('e_rev_E', node)
         if value is not None and 'e_rev_E' not in already_processed:
             already_processed.add('e_rev_E')
@@ -12736,20 +11207,6 @@ class HH_cond_exp(basePyNNCell):
                 self.e_rev_E = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (e_rev_E): %s' % exp)
-        value = find_attr_value_('g_leak', node)
-        if value is not None and 'g_leak' not in already_processed:
-            already_processed.add('g_leak')
-            try:
-                self.g_leak = float(value)
-            except ValueError as exp:
-                raise ValueError('Bad float/double attribute (g_leak): %s' % exp)
-        value = find_attr_value_('e_rev_Na', node)
-        if value is not None and 'e_rev_Na' not in already_processed:
-            already_processed.add('e_rev_Na')
-            try:
-                self.e_rev_Na = float(value)
-            except ValueError as exp:
-                raise ValueError('Bad float/double attribute (e_rev_Na): %s' % exp)
         value = find_attr_value_('e_rev_I', node)
         if value is not None and 'e_rev_I' not in already_processed:
             already_processed.add('e_rev_I')
@@ -12764,6 +11221,13 @@ class HH_cond_exp(basePyNNCell):
                 self.e_rev_K = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (e_rev_K): %s' % exp)
+        value = find_attr_value_('e_rev_Na', node)
+        if value is not None and 'e_rev_Na' not in already_processed:
+            already_processed.add('e_rev_Na')
+            try:
+                self.e_rev_Na = float(value)
+            except ValueError as exp:
+                raise ValueError('Bad float/double attribute (e_rev_Na): %s' % exp)
         value = find_attr_value_('e_rev_leak', node)
         if value is not None and 'e_rev_leak' not in already_processed:
             already_processed.add('e_rev_leak')
@@ -12771,13 +11235,20 @@ class HH_cond_exp(basePyNNCell):
                 self.e_rev_leak = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (e_rev_leak): %s' % exp)
-        value = find_attr_value_('v_offset', node)
-        if value is not None and 'v_offset' not in already_processed:
-            already_processed.add('v_offset')
+        value = find_attr_value_('g_leak', node)
+        if value is not None and 'g_leak' not in already_processed:
+            already_processed.add('g_leak')
             try:
-                self.v_offset = float(value)
+                self.g_leak = float(value)
             except ValueError as exp:
-                raise ValueError('Bad float/double attribute (v_offset): %s' % exp)
+                raise ValueError('Bad float/double attribute (g_leak): %s' % exp)
+        value = find_attr_value_('gbar_K', node)
+        if value is not None and 'gbar_K' not in already_processed:
+            already_processed.add('gbar_K')
+            try:
+                self.gbar_K = float(value)
+            except ValueError as exp:
+                raise ValueError('Bad float/double attribute (gbar_K): %s' % exp)
         value = find_attr_value_('gbar_Na', node)
         if value is not None and 'gbar_Na' not in already_processed:
             already_processed.add('gbar_Na')
@@ -12795,30 +11266,36 @@ class HH_cond_exp(basePyNNCell):
 class basePyNNIaFCell(basePyNNCell):
     subclass = None
     superclass = basePyNNCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, tau_refrac=None, v_thresh=None, tau_m=None, v_reset=None, v_rest=None, extensiontype_=None):
-        super(basePyNNIaFCell, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, extensiontype_, )
-        self.tau_refrac = _cast(float, tau_refrac)
-        self.v_thresh = _cast(float, v_thresh)
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, tau_m=None, tau_refrac=None, v_reset=None, v_rest=None, v_thresh=None, extensiontype_=None):
+        self.original_tagname_ = None
+        super(basePyNNIaFCell, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, extensiontype_, )
         self.tau_m = _cast(float, tau_m)
+        self.tau_refrac = _cast(float, tau_refrac)
         self.v_reset = _cast(float, v_reset)
         self.v_rest = _cast(float, v_rest)
+        self.v_thresh = _cast(float, v_thresh)
         self.extensiontype_ = extensiontype_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, basePyNNIaFCell)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if basePyNNIaFCell.subclass:
             return basePyNNIaFCell.subclass(*args_, **kwargs_)
         else:
             return basePyNNIaFCell(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_tau_refrac(self): return self.tau_refrac
-    def set_tau_refrac(self, tau_refrac): self.tau_refrac = tau_refrac
-    def get_v_thresh(self): return self.v_thresh
-    def set_v_thresh(self, v_thresh): self.v_thresh = v_thresh
     def get_tau_m(self): return self.tau_m
     def set_tau_m(self, tau_m): self.tau_m = tau_m
+    def get_tau_refrac(self): return self.tau_refrac
+    def set_tau_refrac(self, tau_refrac): self.tau_refrac = tau_refrac
     def get_v_reset(self): return self.v_reset
     def set_v_reset(self, v_reset): self.v_reset = v_reset
     def get_v_rest(self): return self.v_rest
     def set_v_rest(self, v_rest): self.v_rest = v_rest
+    def get_v_thresh(self): return self.v_thresh
+    def set_v_thresh(self, v_thresh): self.v_thresh = v_thresh
     def get_extensiontype_(self): return self.extensiontype_
     def set_extensiontype_(self, extensiontype_): self.extensiontype_ = extensiontype_
     def hasContent_(self):
@@ -12833,91 +11310,50 @@ class basePyNNIaFCell(basePyNNCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='basePyNNIaFCell')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='basePyNNIaFCell', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='basePyNNIaFCell'):
         super(basePyNNIaFCell, self).exportAttributes(outfile, level, already_processed, namespace_, name_='basePyNNIaFCell')
-        if self.tau_refrac is not None and 'tau_refrac' not in already_processed:
-            already_processed.add('tau_refrac')
-            outfile.write(' tau_refrac="%s"' % self.gds_format_double(self.tau_refrac, input_name='tau_refrac'))
-        if self.v_thresh is not None and 'v_thresh' not in already_processed:
-            already_processed.add('v_thresh')
-            outfile.write(' v_thresh="%s"' % self.gds_format_double(self.v_thresh, input_name='v_thresh'))
         if self.tau_m is not None and 'tau_m' not in already_processed:
             already_processed.add('tau_m')
             outfile.write(' tau_m="%s"' % self.gds_format_double(self.tau_m, input_name='tau_m'))
+        if self.tau_refrac is not None and 'tau_refrac' not in already_processed:
+            already_processed.add('tau_refrac')
+            outfile.write(' tau_refrac="%s"' % self.gds_format_double(self.tau_refrac, input_name='tau_refrac'))
         if self.v_reset is not None and 'v_reset' not in already_processed:
             already_processed.add('v_reset')
             outfile.write(' v_reset="%s"' % self.gds_format_double(self.v_reset, input_name='v_reset'))
         if self.v_rest is not None and 'v_rest' not in already_processed:
             already_processed.add('v_rest')
             outfile.write(' v_rest="%s"' % self.gds_format_double(self.v_rest, input_name='v_rest'))
+        if self.v_thresh is not None and 'v_thresh' not in already_processed:
+            already_processed.add('v_thresh')
+            outfile.write(' v_thresh="%s"' % self.gds_format_double(self.v_thresh, input_name='v_thresh'))
         if self.extensiontype_ is not None and 'xsi:type' not in already_processed:
             already_processed.add('xsi:type')
             outfile.write(' xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"')
             outfile.write(' xsi:type="%s"' % self.extensiontype_)
     def exportChildren(self, outfile, level, namespace_='', name_='basePyNNIaFCell', fromsubclass_=False, pretty_print=True):
         super(basePyNNIaFCell, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='basePyNNIaFCell'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.tau_refrac is not None and 'tau_refrac' not in already_processed:
-            already_processed.add('tau_refrac')
-            showIndent(outfile, level)
-            outfile.write('tau_refrac=%e,\n' % (self.tau_refrac,))
-        if self.v_thresh is not None and 'v_thresh' not in already_processed:
-            already_processed.add('v_thresh')
-            showIndent(outfile, level)
-            outfile.write('v_thresh=%e,\n' % (self.v_thresh,))
-        if self.tau_m is not None and 'tau_m' not in already_processed:
-            already_processed.add('tau_m')
-            showIndent(outfile, level)
-            outfile.write('tau_m=%e,\n' % (self.tau_m,))
-        if self.v_reset is not None and 'v_reset' not in already_processed:
-            already_processed.add('v_reset')
-            showIndent(outfile, level)
-            outfile.write('v_reset=%e,\n' % (self.v_reset,))
-        if self.v_rest is not None and 'v_rest' not in already_processed:
-            already_processed.add('v_rest')
-            showIndent(outfile, level)
-            outfile.write('v_rest=%e,\n' % (self.v_rest,))
-        super(basePyNNIaFCell, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(basePyNNIaFCell, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('tau_refrac', node)
-        if value is not None and 'tau_refrac' not in already_processed:
-            already_processed.add('tau_refrac')
-            try:
-                self.tau_refrac = float(value)
-            except ValueError as exp:
-                raise ValueError('Bad float/double attribute (tau_refrac): %s' % exp)
-        value = find_attr_value_('v_thresh', node)
-        if value is not None and 'v_thresh' not in already_processed:
-            already_processed.add('v_thresh')
-            try:
-                self.v_thresh = float(value)
-            except ValueError as exp:
-                raise ValueError('Bad float/double attribute (v_thresh): %s' % exp)
         value = find_attr_value_('tau_m', node)
         if value is not None and 'tau_m' not in already_processed:
             already_processed.add('tau_m')
@@ -12925,6 +11361,13 @@ class basePyNNIaFCell(basePyNNCell):
                 self.tau_m = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (tau_m): %s' % exp)
+        value = find_attr_value_('tau_refrac', node)
+        if value is not None and 'tau_refrac' not in already_processed:
+            already_processed.add('tau_refrac')
+            try:
+                self.tau_refrac = float(value)
+            except ValueError as exp:
+                raise ValueError('Bad float/double attribute (tau_refrac): %s' % exp)
         value = find_attr_value_('v_reset', node)
         if value is not None and 'v_reset' not in already_processed:
             already_processed.add('v_reset')
@@ -12939,6 +11382,13 @@ class basePyNNIaFCell(basePyNNCell):
                 self.v_rest = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (v_rest): %s' % exp)
+        value = find_attr_value_('v_thresh', node)
+        if value is not None and 'v_thresh' not in already_processed:
+            already_processed.add('v_thresh')
+            try:
+                self.v_thresh = float(value)
+            except ValueError as exp:
+                raise ValueError('Bad float/double attribute (v_thresh): %s' % exp)
         value = find_attr_value_('xsi:type', node)
         if value is not None and 'xsi:type' not in already_processed:
             already_processed.add('xsi:type')
@@ -12953,11 +11403,16 @@ class basePyNNIaFCell(basePyNNCell):
 class IaFRefCell(IaFCell):
     subclass = None
     superclass = IaFCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, reset=None, C=None, thresh=None, leakConductance=None, leakReversal=None, refract=None):
-        super(IaFRefCell, self).__init__(id, neuroLexId, name, metaid, notes, annotation, reset, C, thresh, leakConductance, leakReversal, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, leakReversal=None, thresh=None, reset=None, C=None, leakConductance=None, refract=None):
+        self.original_tagname_ = None
+        super(IaFRefCell, self).__init__(id, neuroLexId, name, metaid, notes, annotation, leakReversal, thresh, reset, C, leakConductance, )
         self.refract = _cast(None, refract)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, IaFRefCell)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if IaFRefCell.subclass:
             return IaFRefCell.subclass(*args_, **kwargs_)
         else:
@@ -12967,7 +11422,11 @@ class IaFRefCell(IaFCell):
     def set_refract(self, refract): self.refract = refract
     def validate_Nml2Quantity_time(self, value):
         # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
     def hasContent_(self):
         if (
             super(IaFRefCell, self).hasContent_()
@@ -12980,13 +11439,15 @@ class IaFRefCell(IaFCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='IaFRefCell')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='IaFRefCell', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -12998,26 +11459,13 @@ class IaFRefCell(IaFCell):
             outfile.write(' refract=%s' % (quote_attrib(self.refract), ))
     def exportChildren(self, outfile, level, namespace_='', name_='IaFRefCell', fromsubclass_=False, pretty_print=True):
         super(IaFRefCell, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='IaFRefCell'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.refract is not None and 'refract' not in already_processed:
-            already_processed.add('refract')
-            showIndent(outfile, level)
-            outfile.write('refract="%s",\n' % (self.refract,))
-        super(IaFRefCell, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(IaFRefCell, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('refract', node)
         if value is not None and 'refract' not in already_processed:
@@ -13034,11 +11482,16 @@ class IaFRefCell(IaFCell):
 class IaFTauRefCell(IaFTauCell):
     subclass = None
     superclass = IaFTauCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, reset=None, tau=None, thresh=None, leakReversal=None, refract=None):
-        super(IaFTauRefCell, self).__init__(id, neuroLexId, name, metaid, notes, annotation, reset, tau, thresh, leakReversal, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, leakReversal=None, thresh=None, reset=None, tau=None, refract=None):
+        self.original_tagname_ = None
+        super(IaFTauRefCell, self).__init__(id, neuroLexId, name, metaid, notes, annotation, leakReversal, thresh, reset, tau, )
         self.refract = _cast(None, refract)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, IaFTauRefCell)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if IaFTauRefCell.subclass:
             return IaFTauRefCell.subclass(*args_, **kwargs_)
         else:
@@ -13048,7 +11501,11 @@ class IaFTauRefCell(IaFTauCell):
     def set_refract(self, refract): self.refract = refract
     def validate_Nml2Quantity_time(self, value):
         # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
     def hasContent_(self):
         if (
             super(IaFTauRefCell, self).hasContent_()
@@ -13061,13 +11518,15 @@ class IaFTauRefCell(IaFTauCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='IaFTauRefCell')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='IaFTauRefCell', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -13079,26 +11538,13 @@ class IaFTauRefCell(IaFTauCell):
             outfile.write(' refract=%s' % (quote_attrib(self.refract), ))
     def exportChildren(self, outfile, level, namespace_='', name_='IaFTauRefCell', fromsubclass_=False, pretty_print=True):
         super(IaFTauRefCell, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='IaFTauRefCell'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.refract is not None and 'refract' not in already_processed:
-            already_processed.add('refract')
-            showIndent(outfile, level)
-            outfile.write('refract="%s",\n' % (self.refract,))
-        super(IaFTauRefCell, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(IaFTauRefCell, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('refract', node)
         if value is not None and 'refract' not in already_processed:
@@ -13115,12 +11561,18 @@ class IaFTauRefCell(IaFTauCell):
 class ExpTwoSynapse(BaseConductanceBasedSynapse):
     subclass = None
     superclass = BaseConductanceBasedSynapse
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, erev=None, gbase=None, tauDecay=None, tauRise=None, extensiontype_=None):
-        super(ExpTwoSynapse, self).__init__(id, neuroLexId, name, metaid, notes, annotation, erev, gbase, extensiontype_, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, gbase=None, erev=None, tauDecay=None, tauRise=None, extensiontype_=None):
+        self.original_tagname_ = None
+        super(ExpTwoSynapse, self).__init__(id, neuroLexId, name, metaid, notes, annotation, gbase, erev, extensiontype_, )
         self.tauDecay = _cast(None, tauDecay)
         self.tauRise = _cast(None, tauRise)
         self.extensiontype_ = extensiontype_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, ExpTwoSynapse)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if ExpTwoSynapse.subclass:
             return ExpTwoSynapse.subclass(*args_, **kwargs_)
         else:
@@ -13128,13 +11580,17 @@ class ExpTwoSynapse(BaseConductanceBasedSynapse):
     factory = staticmethod(factory)
     def get_tauDecay(self): return self.tauDecay
     def set_tauDecay(self, tauDecay): self.tauDecay = tauDecay
-    def validate_Nml2Quantity_time(self, value):
-        # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
     def get_tauRise(self): return self.tauRise
     def set_tauRise(self, tauRise): self.tauRise = tauRise
     def get_extensiontype_(self): return self.extensiontype_
     def set_extensiontype_(self, extensiontype_): self.extensiontype_ = extensiontype_
+    def validate_Nml2Quantity_time(self, value):
+        # Validate type Nml2Quantity_time, a restriction on xs:string.
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
     def hasContent_(self):
         if (
             super(ExpTwoSynapse, self).hasContent_()
@@ -13147,13 +11603,15 @@ class ExpTwoSynapse(BaseConductanceBasedSynapse):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='ExpTwoSynapse')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='ExpTwoSynapse', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -13172,30 +11630,13 @@ class ExpTwoSynapse(BaseConductanceBasedSynapse):
             outfile.write(' xsi:type="%s"' % self.extensiontype_)
     def exportChildren(self, outfile, level, namespace_='', name_='ExpTwoSynapse', fromsubclass_=False, pretty_print=True):
         super(ExpTwoSynapse, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='ExpTwoSynapse'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.tauDecay is not None and 'tauDecay' not in already_processed:
-            already_processed.add('tauDecay')
-            showIndent(outfile, level)
-            outfile.write('tauDecay="%s",\n' % (self.tauDecay,))
-        if self.tauRise is not None and 'tauRise' not in already_processed:
-            already_processed.add('tauRise')
-            showIndent(outfile, level)
-            outfile.write('tauRise="%s",\n' % (self.tauRise,))
-        super(ExpTwoSynapse, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(ExpTwoSynapse, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('tauDecay', node)
         if value is not None and 'tauDecay' not in already_processed:
@@ -13221,11 +11662,16 @@ class ExpTwoSynapse(BaseConductanceBasedSynapse):
 class ExpOneSynapse(BaseConductanceBasedSynapse):
     subclass = None
     superclass = BaseConductanceBasedSynapse
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, erev=None, gbase=None, tauDecay=None):
-        super(ExpOneSynapse, self).__init__(id, neuroLexId, name, metaid, notes, annotation, erev, gbase, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, gbase=None, erev=None, tauDecay=None):
+        self.original_tagname_ = None
+        super(ExpOneSynapse, self).__init__(id, neuroLexId, name, metaid, notes, annotation, gbase, erev, )
         self.tauDecay = _cast(None, tauDecay)
-        pass
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, ExpOneSynapse)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if ExpOneSynapse.subclass:
             return ExpOneSynapse.subclass(*args_, **kwargs_)
         else:
@@ -13235,7 +11681,11 @@ class ExpOneSynapse(BaseConductanceBasedSynapse):
     def set_tauDecay(self, tauDecay): self.tauDecay = tauDecay
     def validate_Nml2Quantity_time(self, value):
         # Validate type Nml2Quantity_time, a restriction on xs:string.
-        pass
+        if value is not None and Validate_simpletypes_:
+            if not self.gds_validate_simple_patterns(
+                    self.validate_Nml2Quantity_time_patterns_, value):
+                warnings_.warn('Value "%s" does not match xsd pattern restrictions: %s' % (value.encode('utf-8'), self.validate_Nml2Quantity_time_patterns_, ))
+    validate_Nml2Quantity_time_patterns_ = [['^-?([0-9]*(\\.[0-9]+)?)([eE]-?[0-9]+)?[\\s]*(s|ms)$']]
     def hasContent_(self):
         if (
             super(ExpOneSynapse, self).hasContent_()
@@ -13248,13 +11698,15 @@ class ExpOneSynapse(BaseConductanceBasedSynapse):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='ExpOneSynapse')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='ExpOneSynapse', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -13266,26 +11718,13 @@ class ExpOneSynapse(BaseConductanceBasedSynapse):
             outfile.write(' tauDecay=%s' % (quote_attrib(self.tauDecay), ))
     def exportChildren(self, outfile, level, namespace_='', name_='ExpOneSynapse', fromsubclass_=False, pretty_print=True):
         super(ExpOneSynapse, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='ExpOneSynapse'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.tauDecay is not None and 'tauDecay' not in already_processed:
-            already_processed.add('tauDecay')
-            showIndent(outfile, level)
-            outfile.write('tauDecay="%s",\n' % (self.tauDecay,))
-        super(ExpOneSynapse, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(ExpOneSynapse, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('tauDecay', node)
         if value is not None and 'tauDecay' not in already_processed:
@@ -13302,10 +11741,15 @@ class ExpOneSynapse(BaseConductanceBasedSynapse):
 class IF_curr_exp(basePyNNIaFCell):
     subclass = None
     superclass = basePyNNIaFCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, tau_refrac=None, v_thresh=None, tau_m=None, v_reset=None, v_rest=None):
-        super(IF_curr_exp, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, tau_refrac, v_thresh, tau_m, v_reset, v_rest, )
-        pass
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, tau_m=None, tau_refrac=None, v_reset=None, v_rest=None, v_thresh=None):
+        self.original_tagname_ = None
+        super(IF_curr_exp, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, tau_m, tau_refrac, v_reset, v_rest, v_thresh, )
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, IF_curr_exp)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if IF_curr_exp.subclass:
             return IF_curr_exp.subclass(*args_, **kwargs_)
         else:
@@ -13323,13 +11767,15 @@ class IF_curr_exp(basePyNNIaFCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='IF_curr_exp')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='IF_curr_exp', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -13338,22 +11784,13 @@ class IF_curr_exp(basePyNNIaFCell):
         super(IF_curr_exp, self).exportAttributes(outfile, level, already_processed, namespace_, name_='IF_curr_exp')
     def exportChildren(self, outfile, level, namespace_='', name_='IF_curr_exp', fromsubclass_=False, pretty_print=True):
         super(IF_curr_exp, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='IF_curr_exp'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        super(IF_curr_exp, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(IF_curr_exp, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         super(IF_curr_exp, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
@@ -13365,10 +11802,15 @@ class IF_curr_exp(basePyNNIaFCell):
 class IF_curr_alpha(basePyNNIaFCell):
     subclass = None
     superclass = basePyNNIaFCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, tau_refrac=None, v_thresh=None, tau_m=None, v_reset=None, v_rest=None):
-        super(IF_curr_alpha, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, tau_refrac, v_thresh, tau_m, v_reset, v_rest, )
-        pass
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, tau_m=None, tau_refrac=None, v_reset=None, v_rest=None, v_thresh=None):
+        self.original_tagname_ = None
+        super(IF_curr_alpha, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, tau_m, tau_refrac, v_reset, v_rest, v_thresh, )
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, IF_curr_alpha)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if IF_curr_alpha.subclass:
             return IF_curr_alpha.subclass(*args_, **kwargs_)
         else:
@@ -13386,13 +11828,15 @@ class IF_curr_alpha(basePyNNIaFCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='IF_curr_alpha')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='IF_curr_alpha', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -13401,22 +11845,13 @@ class IF_curr_alpha(basePyNNIaFCell):
         super(IF_curr_alpha, self).exportAttributes(outfile, level, already_processed, namespace_, name_='IF_curr_alpha')
     def exportChildren(self, outfile, level, namespace_='', name_='IF_curr_alpha', fromsubclass_=False, pretty_print=True):
         super(IF_curr_alpha, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='IF_curr_alpha'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        super(IF_curr_alpha, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(IF_curr_alpha, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         super(IF_curr_alpha, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
@@ -13428,21 +11863,27 @@ class IF_curr_alpha(basePyNNIaFCell):
 class basePyNNIaFCondCell(basePyNNIaFCell):
     subclass = None
     superclass = basePyNNIaFCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, tau_refrac=None, v_thresh=None, tau_m=None, v_reset=None, v_rest=None, e_rev_I=None, e_rev_E=None, extensiontype_=None):
-        super(basePyNNIaFCondCell, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, tau_refrac, v_thresh, tau_m, v_reset, v_rest, extensiontype_, )
-        self.e_rev_I = _cast(float, e_rev_I)
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, tau_m=None, tau_refrac=None, v_reset=None, v_rest=None, v_thresh=None, e_rev_E=None, e_rev_I=None, extensiontype_=None):
+        self.original_tagname_ = None
+        super(basePyNNIaFCondCell, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, tau_m, tau_refrac, v_reset, v_rest, v_thresh, extensiontype_, )
         self.e_rev_E = _cast(float, e_rev_E)
+        self.e_rev_I = _cast(float, e_rev_I)
         self.extensiontype_ = extensiontype_
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, basePyNNIaFCondCell)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if basePyNNIaFCondCell.subclass:
             return basePyNNIaFCondCell.subclass(*args_, **kwargs_)
         else:
             return basePyNNIaFCondCell(*args_, **kwargs_)
     factory = staticmethod(factory)
-    def get_e_rev_I(self): return self.e_rev_I
-    def set_e_rev_I(self, e_rev_I): self.e_rev_I = e_rev_I
     def get_e_rev_E(self): return self.e_rev_E
     def set_e_rev_E(self, e_rev_E): self.e_rev_E = e_rev_E
+    def get_e_rev_I(self): return self.e_rev_I
+    def set_e_rev_I(self, e_rev_I): self.e_rev_I = e_rev_I
     def get_extensiontype_(self): return self.extensiontype_
     def set_extensiontype_(self, extensiontype_): self.extensiontype_ = extensiontype_
     def hasContent_(self):
@@ -13457,63 +11898,41 @@ class basePyNNIaFCondCell(basePyNNIaFCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='basePyNNIaFCondCell')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='basePyNNIaFCondCell', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
             outfile.write('/>%s' % (eol_, ))
     def exportAttributes(self, outfile, level, already_processed, namespace_='', name_='basePyNNIaFCondCell'):
         super(basePyNNIaFCondCell, self).exportAttributes(outfile, level, already_processed, namespace_, name_='basePyNNIaFCondCell')
-        if self.e_rev_I is not None and 'e_rev_I' not in already_processed:
-            already_processed.add('e_rev_I')
-            outfile.write(' e_rev_I="%s"' % self.gds_format_double(self.e_rev_I, input_name='e_rev_I'))
         if self.e_rev_E is not None and 'e_rev_E' not in already_processed:
             already_processed.add('e_rev_E')
             outfile.write(' e_rev_E="%s"' % self.gds_format_double(self.e_rev_E, input_name='e_rev_E'))
+        if self.e_rev_I is not None and 'e_rev_I' not in already_processed:
+            already_processed.add('e_rev_I')
+            outfile.write(' e_rev_I="%s"' % self.gds_format_double(self.e_rev_I, input_name='e_rev_I'))
         if self.extensiontype_ is not None and 'xsi:type' not in already_processed:
             already_processed.add('xsi:type')
             outfile.write(' xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"')
             outfile.write(' xsi:type="%s"' % self.extensiontype_)
     def exportChildren(self, outfile, level, namespace_='', name_='basePyNNIaFCondCell', fromsubclass_=False, pretty_print=True):
         super(basePyNNIaFCondCell, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='basePyNNIaFCondCell'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.e_rev_I is not None and 'e_rev_I' not in already_processed:
-            already_processed.add('e_rev_I')
-            showIndent(outfile, level)
-            outfile.write('e_rev_I=%e,\n' % (self.e_rev_I,))
-        if self.e_rev_E is not None and 'e_rev_E' not in already_processed:
-            already_processed.add('e_rev_E')
-            showIndent(outfile, level)
-            outfile.write('e_rev_E=%e,\n' % (self.e_rev_E,))
-        super(basePyNNIaFCondCell, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(basePyNNIaFCondCell, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
-        value = find_attr_value_('e_rev_I', node)
-        if value is not None and 'e_rev_I' not in already_processed:
-            already_processed.add('e_rev_I')
-            try:
-                self.e_rev_I = float(value)
-            except ValueError as exp:
-                raise ValueError('Bad float/double attribute (e_rev_I): %s' % exp)
         value = find_attr_value_('e_rev_E', node)
         if value is not None and 'e_rev_E' not in already_processed:
             already_processed.add('e_rev_E')
@@ -13521,6 +11940,13 @@ class basePyNNIaFCondCell(basePyNNIaFCell):
                 self.e_rev_E = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (e_rev_E): %s' % exp)
+        value = find_attr_value_('e_rev_I', node)
+        if value is not None and 'e_rev_I' not in already_processed:
+            already_processed.add('e_rev_I')
+            try:
+                self.e_rev_I = float(value)
+            except ValueError as exp:
+                raise ValueError('Bad float/double attribute (e_rev_I): %s' % exp)
         value = find_attr_value_('xsi:type', node)
         if value is not None and 'xsi:type' not in already_processed:
             already_processed.add('xsi:type')
@@ -13535,11 +11961,17 @@ class basePyNNIaFCondCell(basePyNNIaFCell):
 class BlockingPlasticSynapse(ExpTwoSynapse):
     subclass = None
     superclass = ExpTwoSynapse
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, erev=None, gbase=None, tauDecay=None, tauRise=None, plasticityMechanism=None, blockMechanism=None):
-        super(BlockingPlasticSynapse, self).__init__(id, neuroLexId, name, metaid, notes, annotation, erev, gbase, tauDecay, tauRise, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, gbase=None, erev=None, tauDecay=None, tauRise=None, plasticityMechanism=None, blockMechanism=None):
+        self.original_tagname_ = None
+        super(BlockingPlasticSynapse, self).__init__(id, neuroLexId, name, metaid, notes, annotation, gbase, erev, tauDecay, tauRise, )
         self.plasticityMechanism = plasticityMechanism
         self.blockMechanism = blockMechanism
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, BlockingPlasticSynapse)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if BlockingPlasticSynapse.subclass:
             return BlockingPlasticSynapse.subclass(*args_, **kwargs_)
         else:
@@ -13563,13 +11995,15 @@ class BlockingPlasticSynapse(ExpTwoSynapse):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='BlockingPlasticSynapse')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='BlockingPlasticSynapse', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -13586,45 +12020,26 @@ class BlockingPlasticSynapse(ExpTwoSynapse):
             self.plasticityMechanism.export(outfile, level, namespace_, name_='plasticityMechanism', pretty_print=pretty_print)
         if self.blockMechanism is not None:
             self.blockMechanism.export(outfile, level, namespace_, name_='blockMechanism', pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='BlockingPlasticSynapse'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        super(BlockingPlasticSynapse, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(BlockingPlasticSynapse, self).exportLiteralChildren(outfile, level, name_)
-        if self.plasticityMechanism is not None:
-            showIndent(outfile, level)
-            outfile.write('plasticityMechanism=model_.PlasticityMechanism(\n')
-            self.plasticityMechanism.exportLiteral(outfile, level, name_='plasticityMechanism')
-            showIndent(outfile, level)
-            outfile.write('),\n')
-        if self.blockMechanism is not None:
-            showIndent(outfile, level)
-            outfile.write('blockMechanism=model_.BlockMechanism(\n')
-            self.blockMechanism.exportLiteral(outfile, level, name_='blockMechanism')
-            showIndent(outfile, level)
-            outfile.write('),\n')
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         super(BlockingPlasticSynapse, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         if nodeName_ == 'plasticityMechanism':
             obj_ = PlasticityMechanism.factory()
             obj_.build(child_)
-            self.set_plasticityMechanism(obj_)
+            self.plasticityMechanism = obj_
+            obj_.original_tagname_ = 'plasticityMechanism'
         elif nodeName_ == 'blockMechanism':
             obj_ = BlockMechanism.factory()
             obj_.build(child_)
-            self.set_blockMechanism(obj_)
+            self.blockMechanism = obj_
+            obj_.original_tagname_ = 'blockMechanism'
         super(BlockingPlasticSynapse, self).buildChildren(child_, node, nodeName_, True)
 # end class BlockingPlasticSynapse
 
@@ -13632,15 +12047,20 @@ class BlockingPlasticSynapse(ExpTwoSynapse):
 class EIF_cond_alpha_isfa_ista(basePyNNIaFCondCell):
     subclass = None
     superclass = basePyNNIaFCondCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, tau_refrac=None, v_thresh=None, tau_m=None, v_reset=None, v_rest=None, e_rev_I=None, e_rev_E=None, a=None, delta_T=None, b=None, v_spike=None, tau_w=None):
-        super(EIF_cond_alpha_isfa_ista, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, tau_refrac, v_thresh, tau_m, v_reset, v_rest, e_rev_I, e_rev_E, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, tau_m=None, tau_refrac=None, v_reset=None, v_rest=None, v_thresh=None, e_rev_E=None, e_rev_I=None, a=None, b=None, delta_T=None, tau_w=None, v_spike=None):
+        self.original_tagname_ = None
+        super(EIF_cond_alpha_isfa_ista, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, tau_m, tau_refrac, v_reset, v_rest, v_thresh, e_rev_E, e_rev_I, )
         self.a = _cast(float, a)
-        self.delta_T = _cast(float, delta_T)
         self.b = _cast(float, b)
-        self.v_spike = _cast(float, v_spike)
+        self.delta_T = _cast(float, delta_T)
         self.tau_w = _cast(float, tau_w)
-        pass
+        self.v_spike = _cast(float, v_spike)
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, EIF_cond_alpha_isfa_ista)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if EIF_cond_alpha_isfa_ista.subclass:
             return EIF_cond_alpha_isfa_ista.subclass(*args_, **kwargs_)
         else:
@@ -13648,14 +12068,14 @@ class EIF_cond_alpha_isfa_ista(basePyNNIaFCondCell):
     factory = staticmethod(factory)
     def get_a(self): return self.a
     def set_a(self, a): self.a = a
-    def get_delta_T(self): return self.delta_T
-    def set_delta_T(self, delta_T): self.delta_T = delta_T
     def get_b(self): return self.b
     def set_b(self, b): self.b = b
-    def get_v_spike(self): return self.v_spike
-    def set_v_spike(self, v_spike): self.v_spike = v_spike
+    def get_delta_T(self): return self.delta_T
+    def set_delta_T(self, delta_T): self.delta_T = delta_T
     def get_tau_w(self): return self.tau_w
     def set_tau_w(self, tau_w): self.tau_w = tau_w
+    def get_v_spike(self): return self.v_spike
+    def set_v_spike(self, v_spike): self.v_spike = v_spike
     def hasContent_(self):
         if (
             super(EIF_cond_alpha_isfa_ista, self).hasContent_()
@@ -13668,13 +12088,15 @@ class EIF_cond_alpha_isfa_ista(basePyNNIaFCondCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='EIF_cond_alpha_isfa_ista')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='EIF_cond_alpha_isfa_ista', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -13683,57 +12105,28 @@ class EIF_cond_alpha_isfa_ista(basePyNNIaFCondCell):
         super(EIF_cond_alpha_isfa_ista, self).exportAttributes(outfile, level, already_processed, namespace_, name_='EIF_cond_alpha_isfa_ista')
         if self.a is not None and 'a' not in already_processed:
             already_processed.add('a')
-            outfile.write(' a="%s"' % self.gds_format_double(self.a, input_name='a'))
-        if self.delta_T is not None and 'delta_T' not in already_processed:
-            already_processed.add('delta_T')
-            outfile.write(' delta_T="%s"' % self.gds_format_double(self.delta_T, input_name='delta_T'))
-        if self.b is not None and 'b' not in already_processed:
-            already_processed.add('b')
-            outfile.write(' b="%s"' % self.gds_format_double(self.b, input_name='b'))
-        if self.v_spike is not None and 'v_spike' not in already_processed:
-            already_processed.add('v_spike')
-            outfile.write(' v_spike="%s"' % self.gds_format_double(self.v_spike, input_name='v_spike'))
-        if self.tau_w is not None and 'tau_w' not in already_processed:
-            already_processed.add('tau_w')
-            outfile.write(' tau_w="%s"' % self.gds_format_double(self.tau_w, input_name='tau_w'))
-    def exportChildren(self, outfile, level, namespace_='', name_='EIF_cond_alpha_isfa_ista', fromsubclass_=False, pretty_print=True):
-        super(EIF_cond_alpha_isfa_ista, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='EIF_cond_alpha_isfa_ista'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.a is not None and 'a' not in already_processed:
-            already_processed.add('a')
-            showIndent(outfile, level)
-            outfile.write('a=%e,\n' % (self.a,))
-        if self.delta_T is not None and 'delta_T' not in already_processed:
-            already_processed.add('delta_T')
-            showIndent(outfile, level)
-            outfile.write('delta_T=%e,\n' % (self.delta_T,))
+            outfile.write(' a="%s"' % self.gds_format_double(self.a, input_name='a'))
         if self.b is not None and 'b' not in already_processed:
             already_processed.add('b')
-            showIndent(outfile, level)
-            outfile.write('b=%e,\n' % (self.b,))
-        if self.v_spike is not None and 'v_spike' not in already_processed:
-            already_processed.add('v_spike')
-            showIndent(outfile, level)
-            outfile.write('v_spike=%e,\n' % (self.v_spike,))
+            outfile.write(' b="%s"' % self.gds_format_double(self.b, input_name='b'))
+        if self.delta_T is not None and 'delta_T' not in already_processed:
+            already_processed.add('delta_T')
+            outfile.write(' delta_T="%s"' % self.gds_format_double(self.delta_T, input_name='delta_T'))
         if self.tau_w is not None and 'tau_w' not in already_processed:
             already_processed.add('tau_w')
-            showIndent(outfile, level)
-            outfile.write('tau_w=%e,\n' % (self.tau_w,))
-        super(EIF_cond_alpha_isfa_ista, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(EIF_cond_alpha_isfa_ista, self).exportLiteralChildren(outfile, level, name_)
+            outfile.write(' tau_w="%s"' % self.gds_format_double(self.tau_w, input_name='tau_w'))
+        if self.v_spike is not None and 'v_spike' not in already_processed:
+            already_processed.add('v_spike')
+            outfile.write(' v_spike="%s"' % self.gds_format_double(self.v_spike, input_name='v_spike'))
+    def exportChildren(self, outfile, level, namespace_='', name_='EIF_cond_alpha_isfa_ista', fromsubclass_=False, pretty_print=True):
+        super(EIF_cond_alpha_isfa_ista, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('a', node)
         if value is not None and 'a' not in already_processed:
@@ -13742,13 +12135,6 @@ class EIF_cond_alpha_isfa_ista(basePyNNIaFCondCell):
                 self.a = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (a): %s' % exp)
-        value = find_attr_value_('delta_T', node)
-        if value is not None and 'delta_T' not in already_processed:
-            already_processed.add('delta_T')
-            try:
-                self.delta_T = float(value)
-            except ValueError as exp:
-                raise ValueError('Bad float/double attribute (delta_T): %s' % exp)
         value = find_attr_value_('b', node)
         if value is not None and 'b' not in already_processed:
             already_processed.add('b')
@@ -13756,13 +12142,13 @@ class EIF_cond_alpha_isfa_ista(basePyNNIaFCondCell):
                 self.b = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (b): %s' % exp)
-        value = find_attr_value_('v_spike', node)
-        if value is not None and 'v_spike' not in already_processed:
-            already_processed.add('v_spike')
+        value = find_attr_value_('delta_T', node)
+        if value is not None and 'delta_T' not in already_processed:
+            already_processed.add('delta_T')
             try:
-                self.v_spike = float(value)
+                self.delta_T = float(value)
             except ValueError as exp:
-                raise ValueError('Bad float/double attribute (v_spike): %s' % exp)
+                raise ValueError('Bad float/double attribute (delta_T): %s' % exp)
         value = find_attr_value_('tau_w', node)
         if value is not None and 'tau_w' not in already_processed:
             already_processed.add('tau_w')
@@ -13770,6 +12156,13 @@ class EIF_cond_alpha_isfa_ista(basePyNNIaFCondCell):
                 self.tau_w = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (tau_w): %s' % exp)
+        value = find_attr_value_('v_spike', node)
+        if value is not None and 'v_spike' not in already_processed:
+            already_processed.add('v_spike')
+            try:
+                self.v_spike = float(value)
+            except ValueError as exp:
+                raise ValueError('Bad float/double attribute (v_spike): %s' % exp)
         super(EIF_cond_alpha_isfa_ista, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         super(EIF_cond_alpha_isfa_ista, self).buildChildren(child_, node, nodeName_, True)
@@ -13780,15 +12173,20 @@ class EIF_cond_alpha_isfa_ista(basePyNNIaFCondCell):
 class EIF_cond_exp_isfa_ista(basePyNNIaFCondCell):
     subclass = None
     superclass = basePyNNIaFCondCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, tau_refrac=None, v_thresh=None, tau_m=None, v_reset=None, v_rest=None, e_rev_I=None, e_rev_E=None, a=None, delta_T=None, b=None, v_spike=None, tau_w=None):
-        super(EIF_cond_exp_isfa_ista, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, tau_refrac, v_thresh, tau_m, v_reset, v_rest, e_rev_I, e_rev_E, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, tau_m=None, tau_refrac=None, v_reset=None, v_rest=None, v_thresh=None, e_rev_E=None, e_rev_I=None, a=None, b=None, delta_T=None, tau_w=None, v_spike=None):
+        self.original_tagname_ = None
+        super(EIF_cond_exp_isfa_ista, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, tau_m, tau_refrac, v_reset, v_rest, v_thresh, e_rev_E, e_rev_I, )
         self.a = _cast(float, a)
-        self.delta_T = _cast(float, delta_T)
         self.b = _cast(float, b)
-        self.v_spike = _cast(float, v_spike)
+        self.delta_T = _cast(float, delta_T)
         self.tau_w = _cast(float, tau_w)
-        pass
+        self.v_spike = _cast(float, v_spike)
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, EIF_cond_exp_isfa_ista)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if EIF_cond_exp_isfa_ista.subclass:
             return EIF_cond_exp_isfa_ista.subclass(*args_, **kwargs_)
         else:
@@ -13796,14 +12194,14 @@ class EIF_cond_exp_isfa_ista(basePyNNIaFCondCell):
     factory = staticmethod(factory)
     def get_a(self): return self.a
     def set_a(self, a): self.a = a
-    def get_delta_T(self): return self.delta_T
-    def set_delta_T(self, delta_T): self.delta_T = delta_T
     def get_b(self): return self.b
     def set_b(self, b): self.b = b
-    def get_v_spike(self): return self.v_spike
-    def set_v_spike(self, v_spike): self.v_spike = v_spike
+    def get_delta_T(self): return self.delta_T
+    def set_delta_T(self, delta_T): self.delta_T = delta_T
     def get_tau_w(self): return self.tau_w
     def set_tau_w(self, tau_w): self.tau_w = tau_w
+    def get_v_spike(self): return self.v_spike
+    def set_v_spike(self, v_spike): self.v_spike = v_spike
     def hasContent_(self):
         if (
             super(EIF_cond_exp_isfa_ista, self).hasContent_()
@@ -13816,13 +12214,15 @@ class EIF_cond_exp_isfa_ista(basePyNNIaFCondCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='EIF_cond_exp_isfa_ista')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='EIF_cond_exp_isfa_ista', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -13832,56 +12232,27 @@ class EIF_cond_exp_isfa_ista(basePyNNIaFCondCell):
         if self.a is not None and 'a' not in already_processed:
             already_processed.add('a')
             outfile.write(' a="%s"' % self.gds_format_double(self.a, input_name='a'))
-        if self.delta_T is not None and 'delta_T' not in already_processed:
-            already_processed.add('delta_T')
-            outfile.write(' delta_T="%s"' % self.gds_format_double(self.delta_T, input_name='delta_T'))
         if self.b is not None and 'b' not in already_processed:
             already_processed.add('b')
             outfile.write(' b="%s"' % self.gds_format_double(self.b, input_name='b'))
-        if self.v_spike is not None and 'v_spike' not in already_processed:
-            already_processed.add('v_spike')
-            outfile.write(' v_spike="%s"' % self.gds_format_double(self.v_spike, input_name='v_spike'))
+        if self.delta_T is not None and 'delta_T' not in already_processed:
+            already_processed.add('delta_T')
+            outfile.write(' delta_T="%s"' % self.gds_format_double(self.delta_T, input_name='delta_T'))
         if self.tau_w is not None and 'tau_w' not in already_processed:
             already_processed.add('tau_w')
             outfile.write(' tau_w="%s"' % self.gds_format_double(self.tau_w, input_name='tau_w'))
-    def exportChildren(self, outfile, level, namespace_='', name_='EIF_cond_exp_isfa_ista', fromsubclass_=False, pretty_print=True):
-        super(EIF_cond_exp_isfa_ista, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='EIF_cond_exp_isfa_ista'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        if self.a is not None and 'a' not in already_processed:
-            already_processed.add('a')
-            showIndent(outfile, level)
-            outfile.write('a=%e,\n' % (self.a,))
-        if self.delta_T is not None and 'delta_T' not in already_processed:
-            already_processed.add('delta_T')
-            showIndent(outfile, level)
-            outfile.write('delta_T=%e,\n' % (self.delta_T,))
-        if self.b is not None and 'b' not in already_processed:
-            already_processed.add('b')
-            showIndent(outfile, level)
-            outfile.write('b=%e,\n' % (self.b,))
         if self.v_spike is not None and 'v_spike' not in already_processed:
             already_processed.add('v_spike')
-            showIndent(outfile, level)
-            outfile.write('v_spike=%e,\n' % (self.v_spike,))
-        if self.tau_w is not None and 'tau_w' not in already_processed:
-            already_processed.add('tau_w')
-            showIndent(outfile, level)
-            outfile.write('tau_w=%e,\n' % (self.tau_w,))
-        super(EIF_cond_exp_isfa_ista, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(EIF_cond_exp_isfa_ista, self).exportLiteralChildren(outfile, level, name_)
+            outfile.write(' v_spike="%s"' % self.gds_format_double(self.v_spike, input_name='v_spike'))
+    def exportChildren(self, outfile, level, namespace_='', name_='EIF_cond_exp_isfa_ista', fromsubclass_=False, pretty_print=True):
+        super(EIF_cond_exp_isfa_ista, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         value = find_attr_value_('a', node)
         if value is not None and 'a' not in already_processed:
@@ -13890,13 +12261,6 @@ class EIF_cond_exp_isfa_ista(basePyNNIaFCondCell):
                 self.a = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (a): %s' % exp)
-        value = find_attr_value_('delta_T', node)
-        if value is not None and 'delta_T' not in already_processed:
-            already_processed.add('delta_T')
-            try:
-                self.delta_T = float(value)
-            except ValueError as exp:
-                raise ValueError('Bad float/double attribute (delta_T): %s' % exp)
         value = find_attr_value_('b', node)
         if value is not None and 'b' not in already_processed:
             already_processed.add('b')
@@ -13904,13 +12268,13 @@ class EIF_cond_exp_isfa_ista(basePyNNIaFCondCell):
                 self.b = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (b): %s' % exp)
-        value = find_attr_value_('v_spike', node)
-        if value is not None and 'v_spike' not in already_processed:
-            already_processed.add('v_spike')
+        value = find_attr_value_('delta_T', node)
+        if value is not None and 'delta_T' not in already_processed:
+            already_processed.add('delta_T')
             try:
-                self.v_spike = float(value)
+                self.delta_T = float(value)
             except ValueError as exp:
-                raise ValueError('Bad float/double attribute (v_spike): %s' % exp)
+                raise ValueError('Bad float/double attribute (delta_T): %s' % exp)
         value = find_attr_value_('tau_w', node)
         if value is not None and 'tau_w' not in already_processed:
             already_processed.add('tau_w')
@@ -13918,6 +12282,13 @@ class EIF_cond_exp_isfa_ista(basePyNNIaFCondCell):
                 self.tau_w = float(value)
             except ValueError as exp:
                 raise ValueError('Bad float/double attribute (tau_w): %s' % exp)
+        value = find_attr_value_('v_spike', node)
+        if value is not None and 'v_spike' not in already_processed:
+            already_processed.add('v_spike')
+            try:
+                self.v_spike = float(value)
+            except ValueError as exp:
+                raise ValueError('Bad float/double attribute (v_spike): %s' % exp)
         super(EIF_cond_exp_isfa_ista, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
         super(EIF_cond_exp_isfa_ista, self).buildChildren(child_, node, nodeName_, True)
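The reordered `delta_T`/`v_spike` blocks above all follow one generated pattern: look up the attribute, mark it processed, convert to float, and raise a descriptive error on bad input. A minimal standalone sketch of that pattern (names like `build_float_attr` are illustrative, not from the generated module):

```python
import xml.etree.ElementTree as ET

def find_attr_value_(name, node):
    # Stand-in for the generated helper: plain attribute lookup on the node.
    return node.get(name)

def build_float_attr(obj, name, node, already_processed):
    # The generated buildAttributes() pattern, factored out for illustration:
    # read each attribute at most once and convert it to float, raising a
    # descriptive error on malformed input.
    value = find_attr_value_(name, node)
    if value is not None and name not in already_processed:
        already_processed.add(name)
        try:
            setattr(obj, name, float(value))
        except ValueError as exp:
            raise ValueError('Bad float/double attribute (%s): %s' % (name, exp))

class Cell(object):
    pass

node = ET.fromstring('<EIF_cond_exp_isfa_ista delta_T="2.0" v_spike="-40"/>')
cell, seen = Cell(), set()
for attr in ('delta_T', 'v_spike'):
    build_float_attr(cell, attr, node, seen)
```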
@@ -13928,10 +12299,15 @@ class EIF_cond_exp_isfa_ista(basePyNNIaFCondCell):
 class IF_cond_exp(basePyNNIaFCondCell):
     subclass = None
     superclass = basePyNNIaFCondCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, tau_refrac=None, v_thresh=None, tau_m=None, v_reset=None, v_rest=None, e_rev_I=None, e_rev_E=None):
-        super(IF_cond_exp, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, tau_refrac, v_thresh, tau_m, v_reset, v_rest, e_rev_I, e_rev_E, )
-        pass
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, tau_m=None, tau_refrac=None, v_reset=None, v_rest=None, v_thresh=None, e_rev_E=None, e_rev_I=None):
+        self.original_tagname_ = None
+        super(IF_cond_exp, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, tau_m, tau_refrac, v_reset, v_rest, v_thresh, e_rev_E, e_rev_I, )
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, IF_cond_exp)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if IF_cond_exp.subclass:
             return IF_cond_exp.subclass(*args_, **kwargs_)
         else:
@@ -13949,13 +12325,15 @@ class IF_cond_exp(basePyNNIaFCondCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='IF_cond_exp')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='IF_cond_exp', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -13964,22 +12342,13 @@ class IF_cond_exp(basePyNNIaFCondCell):
         super(IF_cond_exp, self).exportAttributes(outfile, level, already_processed, namespace_, name_='IF_cond_exp')
     def exportChildren(self, outfile, level, namespace_='', name_='IF_cond_exp', fromsubclass_=False, pretty_print=True):
         super(IF_cond_exp, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='IF_cond_exp'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        super(IF_cond_exp, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(IF_cond_exp, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         super(IF_cond_exp, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
@@ -13991,10 +12360,15 @@ class IF_cond_exp(basePyNNIaFCondCell):
 class IF_cond_alpha(basePyNNIaFCondCell):
     subclass = None
     superclass = basePyNNIaFCondCell
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, tau_refrac=None, v_thresh=None, tau_m=None, v_reset=None, v_rest=None, e_rev_I=None, e_rev_E=None):
-        super(IF_cond_alpha, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, tau_refrac, v_thresh, tau_m, v_reset, v_rest, e_rev_I, e_rev_E, )
-        pass
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, tau_m=None, tau_refrac=None, v_reset=None, v_rest=None, v_thresh=None, e_rev_E=None, e_rev_I=None):
+        self.original_tagname_ = None
+        super(IF_cond_alpha, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, tau_m, tau_refrac, v_reset, v_rest, v_thresh, e_rev_E, e_rev_I, )
     def factory(*args_, **kwargs_):
+        if CurrentSubclassModule_ is not None:
+            subclass = getSubclassFromModule_(
+                CurrentSubclassModule_, IF_cond_alpha)
+            if subclass is not None:
+                return subclass(*args_, **kwargs_)
         if IF_cond_alpha.subclass:
             return IF_cond_alpha.subclass(*args_, **kwargs_)
         else:
@@ -14012,13 +12386,15 @@ class IF_cond_alpha(basePyNNIaFCondCell):
             eol_ = '\n'
         else:
             eol_ = ''
+        if self.original_tagname_ is not None:
+            name_ = self.original_tagname_
         showIndent(outfile, level, pretty_print)
         outfile.write('<%s%s%s' % (namespace_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
         already_processed = set()
         self.exportAttributes(outfile, level, already_processed, namespace_, name_='IF_cond_alpha')
         if self.hasContent_():
             outfile.write('>%s' % (eol_, ))
-            self.exportChildren(outfile, level + 1, namespace_, name_, pretty_print=pretty_print)
+            self.exportChildren(outfile, level + 1, namespace_='', name_='IF_cond_alpha', pretty_print=pretty_print)
             showIndent(outfile, level, pretty_print)
             outfile.write('</%s%s>%s' % (namespace_, name_, eol_))
         else:
@@ -14027,22 +12403,13 @@ class IF_cond_alpha(basePyNNIaFCondCell):
         super(IF_cond_alpha, self).exportAttributes(outfile, level, already_processed, namespace_, name_='IF_cond_alpha')
     def exportChildren(self, outfile, level, namespace_='', name_='IF_cond_alpha', fromsubclass_=False, pretty_print=True):
         super(IF_cond_alpha, self).exportChildren(outfile, level, namespace_, name_, True, pretty_print=pretty_print)
-    def exportLiteral(self, outfile, level, name_='IF_cond_alpha'):
-        level += 1
-        already_processed = set()
-        self.exportLiteralAttributes(outfile, level, already_processed, name_)
-        if self.hasContent_():
-            self.exportLiteralChildren(outfile, level, name_)
-    def exportLiteralAttributes(self, outfile, level, already_processed, name_):
-        super(IF_cond_alpha, self).exportLiteralAttributes(outfile, level, already_processed, name_)
-    def exportLiteralChildren(self, outfile, level, name_):
-        super(IF_cond_alpha, self).exportLiteralChildren(outfile, level, name_)
     def build(self, node):
         already_processed = set()
         self.buildAttributes(node, node.attrib, already_processed)
         for child in node:
             nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
             self.buildChildren(child, node, nodeName_)
+        return self
     def buildAttributes(self, node, attrs, already_processed):
         super(IF_cond_alpha, self).buildAttributes(node, attrs, already_processed)
     def buildChildren(self, child_, node, nodeName_, fromsubclass_=False):
@@ -14052,91 +12419,91 @@ class IF_cond_alpha(basePyNNIaFCondCell):
 
 
 GDSClassesMapping = {
-    'intracellularProperties': IntracellularProperties,
-    'inhomogeneousParam': InhomogeneousParam,
-    'q10Settings': Q10Settings,
-    'spikeGenerator': SpikeGenerator,
+    'adExIaFCell': AdExIaFCell,
+    'alphaCondSynapse': AlphaCondSynapse,
+    'alphaCurrSynapse': AlphaCurrSynapse,
+    'annotation': Annotation,
+    'baseCell': BaseCell,
+    'biophysicalProperties': BiophysicalProperties,
+    'blockMechanism': BlockMechanism,
+    'blockingPlasticSynapse': BlockingPlasticSynapse,
+    'cell': Cell,
+    'cellSet': CellSet,
+    'channelDensity': ChannelDensity,
+    'channelPopulation': ChannelPopulation,
+    'connection': Connection,
+    'decayingPoolConcentrationModel': DecayingPoolConcentrationModel,
     'distal': DistalDetails,
-    'random': RandomLayout,
-    'variableParameter': VariableParameter,
-    'subTree': SubTree,
-    'gateHHtauInf': GateHHTauInf,
-    'inputList': InputList,
-    'specificCapacitance': ValueAcrossSegOrSegGroup,
-    'ionChannel': IonChannel,
+    'expCondSynapse': ExpCondSynapse,
+    'expCurrSynapse': ExpCurrSynapse,
+    'expOneSynapse': ExpOneSynapse,
+    'expTwoSynapse': ExpTwoSynapse,
+    'explicitInput': ExplicitInput,
+    'extracellularProperties': ExtracellularPropertiesLocal,
+    'forwardRate': HHRate,
+    'from': SegmentEndPoint,
+    'gate': GateHHUndetermined,
+    'gateHHrates': GateHHRates,
+    'gateHHratesInf': GateHHRatesInf,
     'gateHHratesTau': GateHHRatesTau,
-    'biophysicalProperties': BiophysicalProperties,
-    'membraneProperties': MembraneProperties,
-    'proximal': ProximalDetails,
-    'path': Path,
-    'morphology': Morphology,
+    'gateHHtauInf': GateHHTauInf,
+    'grid': GridLayout,
     'iafCell': IaFCell,
+    'iafRefCell': IaFRefCell,
+    'iafTauCell': IaFTauCell,
     'iafTauRefCell': IaFTauRefCell,
-    'species': Species,
-    'resistivity': ValueAcrossSegOrSegGroup,
-    'member': Member,
+    'include': Include,
+    'inhomogeneousParam': InhomogeneousParam,
     'inhomogeneousValue': InhomogeneousValue,
-    'spikeGeneratorRandom': SpikeGeneratorRandom,
-    'sineGenerator': SineGenerator,
-    'expCondSynapse': ExpCondSynapse,
-    'network': Network,
-    'reverseRate': HHRate,
-    'decayingPoolConcentrationModel': DecayingPoolConcentrationModel,
-    'segment': Segment,
-    'rampGenerator': RampGenerator,
-    'cellSet': CellSet,
-    'gateHHrates': GateHHRates,
-    'cell': Cell,
-    'to': SegmentEndPoint,
-    'voltageClamp': VoltageClamp,
     'initMembPotential': ValueAcrossSegOrSegGroup,
-    'projection': Projection,
-    'spike': Spike,
-    'gate': GateHHUndetermined,
-    'steadyState': HHVariable,
-    'include': Include,
-    'forwardRate': HHRate,
+    'input': Input,
+    'inputList': InputList,
+    'instance': Instance,
+    'intracellularProperties': IntracellularProperties,
+    'ionChannel': IonChannel,
+    'izhikevichCell': IzhikevichCell,
+    'layout': Layout,
     'location': Location,
-    'synapticConnection': SynapticConnection,
+    'member': Member,
+    'membraneProperties': MembraneProperties,
+    'morphology': Morphology,
+    'network': Network,
     'neuroml': NeuroMLDocument,
-    'from': SegmentEndPoint,
-    'blockMechanism': BlockMechanism,
-    'gateHHratesInf': GateHHRatesInf,
     'parent': SegmentParent,
+    'path': Path,
     'plasticityMechanism': PlasticityMechanism,
-    'spikeThresh': ValueAcrossSegOrSegGroup,
-    'annotation': Annotation,
-    'instance': Instance,
-    'adExIaFCell': AdExIaFCell,
-    'grid': GridLayout,
-    'alphaCondSynapse': AlphaCondSynapse,
-    'izhikevichCell': IzhikevichCell,
-    'input': Input,
-    'iafTauCell': IaFTauCell,
-    'segmentGroup': SegmentGroup,
-    'expTwoSynapse': ExpTwoSynapse,
+    'population': Population,
+    'projection': Projection,
+    'proximal': ProximalDetails,
     'pulseGenerator': PulseGenerator,
-    'iafRefCell': IaFRefCell,
-    'structure': SpaceStructure,
-    'spikeArray': SpikeArray,
-    'unstructured': UnstructuredLayout,
-    'blockingPlasticSynapse': BlockingPlasticSynapse,
-    'reversalPotential': ReversalPotential,
-    'channelPopulation': ChannelPopulation,
-    'alphaCurrSynapse': AlphaCurrSynapse,
+    'q10Settings': Q10Settings,
+    'rampGenerator': RampGenerator,
+    'random': RandomLayout,
     'region': Region,
+    'resistivity': ValueAcrossSegOrSegGroup,
+    'reversalPotential': ReversalPotential,
+    'reverseRate': HHRate,
+    'segment': Segment,
+    'segmentGroup': SegmentGroup,
+    'sineGenerator': SineGenerator,
     'space': Space,
-    'expCurrSynapse': ExpCurrSynapse,
-    'population': Population,
-    'timeCourse': HHTime,
-    'explicitInput': ExplicitInput,
-    'extracellularProperties': ExtracellularPropertiesLocal,
-    'connection': Connection,
+    'species': Species,
+    'specificCapacitance': ValueAcrossSegOrSegGroup,
+    'spike': Spike,
+    'spikeArray': SpikeArray,
+    'spikeGenerator': SpikeGenerator,
     'spikeGeneratorPoisson': SpikeGeneratorPoisson,
-    'channelDensity': ChannelDensity,
-    'expOneSynapse': ExpOneSynapse,
-    'layout': Layout,
-    'baseCell': BaseCell,
+    'spikeGeneratorRandom': SpikeGeneratorRandom,
+    'spikeThresh': ValueAcrossSegOrSegGroup,
+    'steadyState': HHVariable,
+    'structure': SpaceStructure,
+    'subTree': SubTree,
+    'synapticConnection': SynapticConnection,
+    'timeCourse': HHTime,
+    'to': SegmentEndPoint,
+    'unstructured': UnstructuredLayout,
+    'variableParameter': VariableParameter,
+    'voltageClamp': VoltageClamp,
 }
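The regenerated `GDSClassesMapping` above is now sorted alphabetically but serves the same purpose: it maps an XML local tag name to the generated class, and `get_root_tag` uses it to pick the root class. A miniature sketch of that dispatch, with hypothetical stand-in classes:

```python
import re

# Hypothetical stand-ins for two of the generated classes (illustration only).
class Network(object): pass
class Annotation(object): pass

# Miniature GDSClassesMapping: XML local tag name -> generated Python class.
GDSClassesMapping = {'network': Network, 'annotation': Annotation}

def get_root_tag(tag):
    # Strip a '{namespace}' prefix the way Tag_pattern_ does, then look the
    # local name up in the mapping; None means "unknown tag".
    local = re.match(r'(\{.*\})?(.*)', tag).groups()[-1]
    return local, GDSClassesMapping.get(local)

tag, cls = get_root_tag('{http://www.neuroml.org/schema/neuroml2}network')
```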
 
 
@@ -14158,8 +12525,9 @@ def get_root_tag(node):
     return tag, rootClass
 
 
-def parse(inFileName):
-    doc = parsexml_(inFileName)
+def parse(inFileName, silence=False):
+    parser = None
+    doc = parsexml_(inFileName, parser)
     rootNode = doc.getroot()
     rootTag, rootClass = get_root_tag(rootNode)
     if rootClass is None:
@@ -14169,16 +12537,18 @@ def parse(inFileName):
     rootObj.build(rootNode)
     # Enable Python to collect the space used by the DOM.
     doc = None
-##     sys.stdout.write('<?xml version="1.0" ?>\n')
-##     rootObj.export(
-##         sys.stdout, 0, name_=rootTag,
-##         namespacedef_='',
-##         pretty_print=True)
+    if not silence:
+        sys.stdout.write('<?xml version="1.0" ?>\n')
+        rootObj.export(
+            sys.stdout, 0, name_=rootTag,
+            namespacedef_='',
+            pretty_print=True)
     return rootObj
 
 
-def parseEtree(inFileName):
-    doc = parsexml_(inFileName)
+def parseEtree(inFileName, silence=False):
+    parser = None
+    doc = parsexml_(inFileName, parser)
     rootNode = doc.getroot()
     rootTag, rootClass = get_root_tag(rootNode)
     if rootClass is None:
@@ -14191,35 +12561,39 @@ def parseEtree(inFileName):
     mapping = {}
     rootElement = rootObj.to_etree(None, name_=rootTag, mapping_=mapping)
     reverse_mapping = rootObj.gds_reverse_node_mapping(mapping)
-##     content = etree_.tostring(
-##         rootElement, pretty_print=True,
-##         xml_declaration=True, encoding="utf-8")
-##     sys.stdout.write(content)
-##     sys.stdout.write('\n')
+    if not silence:
+        content = etree_.tostring(
+            rootElement, pretty_print=True,
+            xml_declaration=True, encoding="utf-8")
+        sys.stdout.write(content)
+        sys.stdout.write('\n')
     return rootObj, rootElement, mapping, reverse_mapping
 
 
-def parseString(inString):
-    from io import StringIO
-    doc = parsexml_(StringIO(inString))
+def parseString(inString, silence=False):
+    from StringIO import StringIO
+    parser = None
+    doc = parsexml_(StringIO(inString), parser)
     rootNode = doc.getroot()
-    roots = get_root_tag(rootNode)
-    rootClass = roots[1]
+    rootTag, rootClass = get_root_tag(rootNode)
     if rootClass is None:
+        rootTag = 'Annotation'
         rootClass = Annotation
     rootObj = rootClass.factory()
     rootObj.build(rootNode)
     # Enable Python to collect the space used by the DOM.
     doc = None
-##     sys.stdout.write('<?xml version="1.0" ?>\n')
-##     rootObj.export(
-##         sys.stdout, 0, name_="Annotation",
-##         namespacedef_='')
+    if not silence:
+        sys.stdout.write('<?xml version="1.0" ?>\n')
+        rootObj.export(
+            sys.stdout, 0, name_=rootTag,
+            namespacedef_='')
     return rootObj
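Note that the regenerated `parseString` imports `StringIO` from the Python-2-only `StringIO` module, replacing the old `from io import StringIO`. A version-agnostic import (an assumption for portability, not what the generator emits) would be:

```python
import sys

if sys.version_info[0] >= 3:
    from io import StringIO          # Python 3 location
else:
    from StringIO import StringIO    # Python 2 location, as in the regenerated code

buf = StringIO('<neuroml/>')
data = buf.read()
```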
 
 
-def parseLiteral(inFileName):
-    doc = parsexml_(inFileName)
+def parseLiteral(inFileName, silence=False):
+    parser = None
+    doc = parsexml_(inFileName, parser)
     rootNode = doc.getroot()
     rootTag, rootClass = get_root_tag(rootNode)
     if rootClass is None:
@@ -14229,11 +12603,12 @@ def parseLiteral(inFileName):
     rootObj.build(rootNode)
     # Enable Python to collect the space used by the DOM.
     doc = None
-##     sys.stdout.write('#from generated_neuroml import *\n\n')
-##     sys.stdout.write('import generated_neuroml as model_\n\n')
-##     sys.stdout.write('rootObj = model_.rootTag(\n')
-##     rootObj.exportLiteral(sys.stdout, 0, name_=rootTag)
-##     sys.stdout.write(')\n')
+    if not silence:
+        sys.stdout.write('#from generated_neuroml import *\n\n')
+        sys.stdout.write('import generated_neuroml as model_\n\n')
+        sys.stdout.write('rootObj = model_.rootClass(\n')
+        rootObj.exportLiteral(sys.stdout, 0, name_=rootTag)
+        sys.stdout.write(')\n')
     return rootObj
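All four regenerated entry points (`parse`, `parseEtree`, `parseString`, `parseLiteral`) gain a `silence=False` keyword: the previously commented-out stdout echo is now live unless the caller passes `silence=True`. A toy sketch of that calling convention, with the actual XML parsing reduced to a file read:

```python
import os
import sys
import tempfile

def parse(in_file, silence=False):
    # Toy stand-in for the regenerated parse(): "parsing" is reduced to
    # reading the file, but the silence logic mirrors the new signature.
    with open(in_file) as fh:
        doc = fh.read()
    if not silence:
        sys.stdout.write(doc)  # echoed to stdout unless silenced
    return doc

# Usage: write a tiny document and parse it without the stdout echo.
fd, path = tempfile.mkstemp(suffix='.xml')
with os.fdopen(fd, 'w') as fh:
    fh.write('<neuroml/>')
result = parse(path, silence=True)
os.remove(path)
```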
 
 
diff --git a/python/moose/neuroml2/generated_neuromlsub.py b/python/moose/neuroml2/generated_neuromlsub.py
index 5b2eb233..3f066bdd 100644
--- a/python/moose/neuroml2/generated_neuromlsub.py
+++ b/python/moose/neuroml2/generated_neuromlsub.py
@@ -1,64 +1,33 @@
 #!/usr/bin/env python
 
 #
-# Generated Sun Jul 28 10:18:38 2013 by generateDS.py version 2.10a.
+# Generated Sun Apr 17 15:01:32 2016 by generateDS.py version 2.22a.
+#
+# Command line options:
+#   ('-o', 'generated_neuroml.py')
+#   ('-s', 'generated_neuromlsub.py')
+#
+# Command line arguments:
+#   /home/subha/src/neuroml_dev/NeuroML2/Schemas/NeuroML2/NeuroML_v2beta.xsd
+#
+# Command line:
+#   /home/subha/.local/bin/generateDS.py -o "generated_neuroml.py" -s "generated_neuromlsub.py" /home/subha/src/neuroml_dev/NeuroML2/Schemas/NeuroML2/NeuroML_v2beta.xsd
+#
+# Current working directory (os.getcwd()):
+#   neuroml2
 #
 
 import sys
+from lxml import etree as etree_
 
-etree_ = None
-Verbose_import_ = False
-(
-    XMLParser_import_none, XMLParser_import_lxml,
-    XMLParser_import_elementtree
-) = range(3)
-XMLParser_import_library = None
-try:
-    # lxml
-    from lxml import etree as etree_
-    XMLParser_import_library = XMLParser_import_lxml
-    if Verbose_import_:
-        print("running with lxml.etree")
-except ImportError:
-    try:
-        # cElementTree from Python 2.5+
-        import xml.etree.cElementTree as etree_
-        XMLParser_import_library = XMLParser_import_elementtree
-        if Verbose_import_:
-            print("running with cElementTree on Python 2.5+")
-    except ImportError:
-        try:
-            # ElementTree from Python 2.5+
-            import xml.etree.ElementTree as etree_
-            XMLParser_import_library = XMLParser_import_elementtree
-            if Verbose_import_:
-                print("running with ElementTree on Python 2.5+")
-        except ImportError:
-            try:
-                # normal cElementTree install
-                import cElementTree as etree_
-                XMLParser_import_library = XMLParser_import_elementtree
-                if Verbose_import_:
-                    print("running with cElementTree")
-            except ImportError:
-                try:
-                    # normal ElementTree install
-                    import elementtree.ElementTree as etree_
-                    XMLParser_import_library = XMLParser_import_elementtree
-                    if Verbose_import_:
-                        print("running with ElementTree")
-                except ImportError:
-                    raise ImportError(
-                        "Failed to import ElementTree from any known place")
-
-
-def parsexml_(*args, **kwargs):
-    if (XMLParser_import_library == XMLParser_import_lxml and
-            'parser' not in kwargs):
+import ??? as supermod
+
+def parsexml_(infile, parser=None, **kwargs):
+    if parser is None:
         # Use the lxml ElementTree compatible parser so that, e.g.,
         #   we ignore comments.
-        kwargs['parser'] = etree_.ETCompatXMLParser()
-    doc = etree_.parse(*args, **kwargs)
+        parser = etree_.ETCompatXMLParser()
+    doc = etree_.parse(infile, parser=parser, **kwargs)
     return doc
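The rewritten `parsexml_` replaces the five-way ElementTree import fallback with a hard lxml dependency and an explicit optional `parser` argument. A dependency-free sketch of the same shape (the generated module actually defaults to lxml's `ETCompatXMLParser`, which ignores comments; stdlib `ElementTree` stands in here only to keep the example self-contained):

```python
import io
import xml.etree.ElementTree as etree_

def parsexml_(infile, parser=None, **kwargs):
    # Same shape as the rewritten helper: construct a default parser only
    # when the caller did not supply one, then delegate to etree_.parse().
    if parser is None:
        parser = etree_.XMLParser()
    doc = etree_.parse(infile, parser=parser, **kwargs)
    return doc

doc = parsexml_(io.StringIO('<neuroml><cell/></neuroml>'))
root_tag = doc.getroot().tag
```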
 
 #
@@ -80,8 +49,8 @@ supermod.Annotation.subclass = AnnotationSub
 
 
 class ComponentTypeSub(supermod.ComponentType):
-    def __init__(self, extends=None, name=None, description=None, anytypeobjs_=None):
-        super(ComponentTypeSub, self).__init__(extends, name, description, anytypeobjs_, )
+    def __init__(self, name=None, extends=None, description=None, anytypeobjs_=None):
+        super(ComponentTypeSub, self).__init__(name, extends, description, anytypeobjs_, )
 supermod.ComponentType.subclass = ComponentTypeSub
 # end class ComponentTypeSub
 
@@ -94,57 +63,57 @@ supermod.IncludeType.subclass = IncludeTypeSub
 
 
 class Q10SettingsSub(supermod.Q10Settings):
-    def __init__(self, fixedQ10=None, experimentalTemp=None, type_=None, q10Factor=None):
-        super(Q10SettingsSub, self).__init__(fixedQ10, experimentalTemp, type_, q10Factor, )
+    def __init__(self, type_=None, fixedQ10=None, q10Factor=None, experimentalTemp=None):
+        super(Q10SettingsSub, self).__init__(type_, fixedQ10, q10Factor, experimentalTemp, )
 supermod.Q10Settings.subclass = Q10SettingsSub
 # end class Q10SettingsSub
 
 
 class HHRateSub(supermod.HHRate):
-    def __init__(self, midpoint=None, rate=None, scale=None, type_=None):
-        super(HHRateSub, self).__init__(midpoint, rate, scale, type_, )
+    def __init__(self, type_=None, rate=None, midpoint=None, scale=None):
+        super(HHRateSub, self).__init__(type_, rate, midpoint, scale, )
 supermod.HHRate.subclass = HHRateSub
 # end class HHRateSub
 
 
 class HHVariableSub(supermod.HHVariable):
-    def __init__(self, midpoint=None, rate=None, scale=None, type_=None):
-        super(HHVariableSub, self).__init__(midpoint, rate, scale, type_, )
+    def __init__(self, type_=None, rate=None, midpoint=None, scale=None):
+        super(HHVariableSub, self).__init__(type_, rate, midpoint, scale, )
 supermod.HHVariable.subclass = HHVariableSub
 # end class HHVariableSub
 
 
 class HHTimeSub(supermod.HHTime):
-    def __init__(self, midpoint=None, rate=None, scale=None, type_=None, tau=None):
-        super(HHTimeSub, self).__init__(midpoint, rate, scale, type_, tau, )
+    def __init__(self, type_=None, rate=None, midpoint=None, scale=None, tau=None):
+        super(HHTimeSub, self).__init__(type_, rate, midpoint, scale, tau, )
 supermod.HHTime.subclass = HHTimeSub
 # end class HHTimeSub
 
 
 class BlockMechanismSub(supermod.BlockMechanism):
-    def __init__(self, blockConcentration=None, scalingConc=None, type_=None, species=None, scalingVolt=None):
-        super(BlockMechanismSub, self).__init__(blockConcentration, scalingConc, type_, species, scalingVolt, )
+    def __init__(self, type_=None, species=None, blockConcentration=None, scalingConc=None, scalingVolt=None):
+        super(BlockMechanismSub, self).__init__(type_, species, blockConcentration, scalingConc, scalingVolt, )
 supermod.BlockMechanism.subclass = BlockMechanismSub
 # end class BlockMechanismSub
 
 
 class PlasticityMechanismSub(supermod.PlasticityMechanism):
-    def __init__(self, type_=None, tauFac=None, tauRec=None, initReleaseProb=None):
-        super(PlasticityMechanismSub, self).__init__(type_, tauFac, tauRec, initReleaseProb, )
+    def __init__(self, type_=None, initReleaseProb=None, tauRec=None, tauFac=None):
+        super(PlasticityMechanismSub, self).__init__(type_, initReleaseProb, tauRec, tauFac, )
 supermod.PlasticityMechanism.subclass = PlasticityMechanismSub
 # end class PlasticityMechanismSub
 
 
 class SegmentParentSub(supermod.SegmentParent):
-    def __init__(self, fractionAlong='1', segment=None):
-        super(SegmentParentSub, self).__init__(fractionAlong, segment, )
+    def __init__(self, segment=None, fractionAlong='1'):
+        super(SegmentParentSub, self).__init__(segment, fractionAlong, )
 supermod.SegmentParent.subclass = SegmentParentSub
 # end class SegmentParentSub
 
 
 class Point3DWithDiamSub(supermod.Point3DWithDiam):
-    def __init__(self, y=None, x=None, z=None, diameter=None):
-        super(Point3DWithDiamSub, self).__init__(y, x, z, diameter, )
+    def __init__(self, x=None, y=None, z=None, diameter=None):
+        super(Point3DWithDiamSub, self).__init__(x, y, z, diameter, )
 supermod.Point3DWithDiam.subclass = Point3DWithDiamSub
 # end class Point3DWithDiamSub
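Each `*Sub` class in this file only reorders constructor arguments to match the regenerated supermod, then installs itself via `supermod.X.subclass = XSub`. That hook is what the superclass `factory()` consults. A miniature self-contained sketch of the mechanism (class names reused for illustration; this is not the generated code itself):

```python
class Point3DWithDiam(object):
    # Miniature version of the generated superclass: factory() returns the
    # registered subclass when one has been installed.
    subclass = None

    def __init__(self, x=None, y=None, z=None, diameter=None):
        self.x, self.y, self.z, self.diameter = x, y, z, diameter

    @staticmethod
    def factory(*args_, **kwargs_):
        if Point3DWithDiam.subclass:
            return Point3DWithDiam.subclass(*args_, **kwargs_)
        return Point3DWithDiam(*args_, **kwargs_)

class Point3DWithDiamSub(Point3DWithDiam):
    def __init__(self, x=None, y=None, z=None, diameter=None):
        super(Point3DWithDiamSub, self).__init__(x, y, z, diameter)

# Registering the hook redirects every factory() call to the subclass.
Point3DWithDiam.subclass = Point3DWithDiamSub
p = Point3DWithDiam.factory(x=1.0, y=2.0, z=3.0, diameter=0.5)
```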
 
@@ -178,15 +147,15 @@ supermod.Include.subclass = IncludeSub
 
 
 class PathSub(supermod.Path):
-    def __init__(self, fromxx=None, to=None):
-        super(PathSub, self).__init__(fromxx, to, )
+    def __init__(self, from_=None, to=None):
+        super(PathSub, self).__init__(from_, to, )
 supermod.Path.subclass = PathSub
 # end class PathSub
 
 
 class SubTreeSub(supermod.SubTree):
-    def __init__(self, fromxx=None, to=None):
-        super(SubTreeSub, self).__init__(fromxx, to, )
+    def __init__(self, from_=None, to=None):
+        super(SubTreeSub, self).__init__(from_, to, )
 supermod.SubTree.subclass = SubTreeSub
 # end class SubTreeSub
 
@@ -206,15 +175,15 @@ supermod.MembraneProperties.subclass = MembranePropertiesSub
 
 
 class ValueAcrossSegOrSegGroupSub(supermod.ValueAcrossSegOrSegGroup):
-    def __init__(self, segment=None, segmentGroup='all', value=None, extensiontype_=None):
-        super(ValueAcrossSegOrSegGroupSub, self).__init__(segment, segmentGroup, value, extensiontype_, )
+    def __init__(self, value=None, segmentGroup='all', segment=None, extensiontype_=None):
+        super(ValueAcrossSegOrSegGroupSub, self).__init__(value, segmentGroup, segment, extensiontype_, )
 supermod.ValueAcrossSegOrSegGroup.subclass = ValueAcrossSegOrSegGroupSub
 # end class ValueAcrossSegOrSegGroupSub
 
 
 class VariableParameterSub(supermod.VariableParameter):
-    def __init__(self, segmentGroup=None, parameter=None, inhomogeneousValue=None):
-        super(VariableParameterSub, self).__init__(segmentGroup, parameter, inhomogeneousValue, )
+    def __init__(self, parameter=None, segmentGroup=None, inhomogeneousValue=None):
+        super(VariableParameterSub, self).__init__(parameter, segmentGroup, inhomogeneousValue, )
 supermod.VariableParameter.subclass = VariableParameterSub
 # end class VariableParameterSub
 
@@ -227,15 +196,15 @@ supermod.InhomogeneousValue.subclass = InhomogeneousValueSub
 
 
 class ReversalPotentialSub(supermod.ReversalPotential):
-    def __init__(self, segment=None, segmentGroup='all', value=None, species=None):
-        super(ReversalPotentialSub, self).__init__(segment, segmentGroup, value, species, )
+    def __init__(self, value=None, segmentGroup='all', segment=None, species=None):
+        super(ReversalPotentialSub, self).__init__(value, segmentGroup, segment, species, )
 supermod.ReversalPotential.subclass = ReversalPotentialSub
 # end class ReversalPotentialSub
 
 
 class SpeciesSub(supermod.Species):
-    def __init__(self, segment=None, segmentGroup='all', value=None, ion=None, initialExtConcentration=None, concentrationModel=None, id=None, initialConcentration=None):
-        super(SpeciesSub, self).__init__(segment, segmentGroup, value, ion, initialExtConcentration, concentrationModel, id, initialConcentration, )
+    def __init__(self, value=None, segmentGroup='all', segment=None, id=None, concentrationModel=None, ion=None, initialConcentration=None, initialExtConcentration=None):
+        super(SpeciesSub, self).__init__(value, segmentGroup, segment, id, concentrationModel, ion, initialConcentration, initialExtConcentration, )
 supermod.Species.subclass = SpeciesSub
 # end class SpeciesSub
 
@@ -255,8 +224,8 @@ supermod.ExtracellularPropertiesLocal.subclass = ExtracellularPropertiesLocalSub
 
 
 class SpaceStructureSub(supermod.SpaceStructure):
-    def __init__(self, ySpacing=None, zStart=0, yStart=0, zSpacing=None, xStart=0, xSpacing=None):
-        super(SpaceStructureSub, self).__init__(ySpacing, zStart, yStart, zSpacing, xStart, xSpacing, )
+    def __init__(self, xSpacing=None, ySpacing=None, zSpacing=None, xStart=0, yStart=0, zStart=0):
+        super(SpaceStructureSub, self).__init__(xSpacing, ySpacing, zSpacing, xStart, yStart, zStart, )
 supermod.SpaceStructure.subclass = SpaceStructureSub
 # end class SpaceStructureSub
 
@@ -276,57 +245,57 @@ supermod.UnstructuredLayout.subclass = UnstructuredLayoutSub
 
 
 class RandomLayoutSub(supermod.RandomLayout):
-    def __init__(self, region=None, number=None):
-        super(RandomLayoutSub, self).__init__(region, number, )
+    def __init__(self, number=None, region=None):
+        super(RandomLayoutSub, self).__init__(number, region, )
 supermod.RandomLayout.subclass = RandomLayoutSub
 # end class RandomLayoutSub
 
 
 class GridLayoutSub(supermod.GridLayout):
-    def __init__(self, zSize=None, ySize=None, xSize=None):
-        super(GridLayoutSub, self).__init__(zSize, ySize, xSize, )
+    def __init__(self, xSize=None, ySize=None, zSize=None):
+        super(GridLayoutSub, self).__init__(xSize, ySize, zSize, )
 supermod.GridLayout.subclass = GridLayoutSub
 # end class GridLayoutSub
 
 
 class InstanceSub(supermod.Instance):
-    def __init__(self, i=None, k=None, j=None, id=None, location=None):
-        super(InstanceSub, self).__init__(i, k, j, id, location, )
+    def __init__(self, id=None, i=None, j=None, k=None, location=None):
+        super(InstanceSub, self).__init__(id, i, j, k, location, )
 supermod.Instance.subclass = InstanceSub
 # end class InstanceSub
 
 
 class LocationSub(supermod.Location):
-    def __init__(self, y=None, x=None, z=None):
-        super(LocationSub, self).__init__(y, x, z, )
+    def __init__(self, x=None, y=None, z=None):
+        super(LocationSub, self).__init__(x, y, z, )
 supermod.Location.subclass = LocationSub
 # end class LocationSub
 
 
 class SynapticConnectionSub(supermod.SynapticConnection):
-    def __init__(self, to=None, synapse=None, fromxx=None):
-        super(SynapticConnectionSub, self).__init__(to, synapse, fromxx, )
+    def __init__(self, from_=None, to=None, synapse=None):
+        super(SynapticConnectionSub, self).__init__(from_, to, synapse, )
 supermod.SynapticConnection.subclass = SynapticConnectionSub
 # end class SynapticConnectionSub
 
 
 class ConnectionSub(supermod.Connection):
-    def __init__(self, postCellId=None, id=None, preCellId=None):
-        super(ConnectionSub, self).__init__(postCellId, id, preCellId, )
+    def __init__(self, id=None, preCellId=None, postCellId=None):
+        super(ConnectionSub, self).__init__(id, preCellId, postCellId, )
 supermod.Connection.subclass = ConnectionSub
 # end class ConnectionSub
 
 
 class ExplicitInputSub(supermod.ExplicitInput):
-    def __init__(self, input=None, destination=None, target=None):
-        super(ExplicitInputSub, self).__init__(input, destination, target, )
+    def __init__(self, target=None, input=None, destination=None):
+        super(ExplicitInputSub, self).__init__(target, input, destination, )
 supermod.ExplicitInput.subclass = ExplicitInputSub
 # end class ExplicitInputSub
 
 
 class InputSub(supermod.Input):
-    def __init__(self, destination=None, id=None, target=None):
-        super(InputSub, self).__init__(destination, id, target, )
+    def __init__(self, id=None, target=None, destination=None):
+        super(InputSub, self).__init__(id, target, destination, )
 supermod.Input.subclass = InputSub
 # end class InputSub
 
@@ -346,22 +315,22 @@ supermod.Standalone.subclass = StandaloneSub
 
 
 class SpikeSourcePoissonSub(supermod.SpikeSourcePoisson):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, duration=None, start=None, rate=None):
-        super(SpikeSourcePoissonSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, duration, start, rate, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, start=None, duration=None, rate=None):
+        super(SpikeSourcePoissonSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, start, duration, rate, )
 supermod.SpikeSourcePoisson.subclass = SpikeSourcePoissonSub
 # end class SpikeSourcePoissonSub
 
 
 class InputListSub(supermod.InputList):
-    def __init__(self, id=None, neuroLexId=None, component=None, population=None, input=None):
-        super(InputListSub, self).__init__(id, neuroLexId, component, population, input, )
+    def __init__(self, id=None, neuroLexId=None, population=None, component=None, input=None):
+        super(InputListSub, self).__init__(id, neuroLexId, population, component, input, )
 supermod.InputList.subclass = InputListSub
 # end class InputListSub
 
 
 class ProjectionSub(supermod.Projection):
-    def __init__(self, id=None, neuroLexId=None, postsynapticPopulation=None, presynapticPopulation=None, synapse=None, connection=None):
-        super(ProjectionSub, self).__init__(id, neuroLexId, postsynapticPopulation, presynapticPopulation, synapse, connection, )
+    def __init__(self, id=None, neuroLexId=None, presynapticPopulation=None, postsynapticPopulation=None, synapse=None, connection=None):
+        super(ProjectionSub, self).__init__(id, neuroLexId, presynapticPopulation, postsynapticPopulation, synapse, connection, )
 supermod.Projection.subclass = ProjectionSub
 # end class ProjectionSub
 
@@ -374,8 +343,8 @@ supermod.CellSet.subclass = CellSetSub
 
 
 class PopulationSub(supermod.Population):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, extracellularProperties=None, network=None, component=None, cell=None, type_=None, size=None, layout=None, instance=None):
-        super(PopulationSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, extracellularProperties, network, component, cell, type_, size, layout, instance, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cell=None, network=None, component=None, size=None, type_=None, extracellularProperties=None, layout=None, instance=None):
+        super(PopulationSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cell, network, component, size, type_, extracellularProperties, layout, instance, )
 supermod.Population.subclass = PopulationSub
 # end class PopulationSub
 
@@ -409,8 +378,8 @@ supermod.SpikeGeneratorPoisson.subclass = SpikeGeneratorPoissonSub
 
 
 class SpikeGeneratorRandomSub(supermod.SpikeGeneratorRandom):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, minISI=None, maxISI=None):
-        super(SpikeGeneratorRandomSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, minISI, maxISI, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, maxISI=None, minISI=None):
+        super(SpikeGeneratorRandomSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, maxISI, minISI, )
 supermod.SpikeGeneratorRandom.subclass = SpikeGeneratorRandomSub
 # end class SpikeGeneratorRandomSub
 
@@ -437,22 +406,22 @@ supermod.Spike.subclass = SpikeSub
 
 
 class VoltageClampSub(supermod.VoltageClamp):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, delay=None, duration=None, seriesResistance=None, targetVoltage=None):
-        super(VoltageClampSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, delay, duration, seriesResistance, targetVoltage, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, delay=None, duration=None, targetVoltage=None, seriesResistance=None):
+        super(VoltageClampSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, delay, duration, targetVoltage, seriesResistance, )
 supermod.VoltageClamp.subclass = VoltageClampSub
 # end class VoltageClampSub
 
 
 class RampGeneratorSub(supermod.RampGenerator):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, delay=None, duration=None, baselineAmplitude=None, startAmplitude=None, finishAmplitude=None):
-        super(RampGeneratorSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, delay, duration, baselineAmplitude, startAmplitude, finishAmplitude, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, delay=None, duration=None, startAmplitude=None, finishAmplitude=None, baselineAmplitude=None):
+        super(RampGeneratorSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, delay, duration, startAmplitude, finishAmplitude, baselineAmplitude, )
 supermod.RampGenerator.subclass = RampGeneratorSub
 # end class RampGeneratorSub
 
 
 class SineGeneratorSub(supermod.SineGenerator):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, delay=None, phase=None, duration=None, period=None, amplitude=None):
-        super(SineGeneratorSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, delay, phase, duration, period, amplitude, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, delay=None, phase=None, duration=None, amplitude=None, period=None):
+        super(SineGeneratorSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, delay, phase, duration, amplitude, period, )
 supermod.SineGenerator.subclass = SineGeneratorSub
 # end class SineGeneratorSub
 
@@ -479,15 +448,15 @@ supermod.ExtracellularProperties.subclass = ExtracellularPropertiesSub
 
 
 class ChannelDensitySub(supermod.ChannelDensity):
-    def __init__(self, id=None, neuroLexId=None, segmentGroup='all', ion=None, ionChannel=None, erev=None, condDensity=None, segment=None, variableParameter=None):
-        super(ChannelDensitySub, self).__init__(id, neuroLexId, segmentGroup, ion, ionChannel, erev, condDensity, segment, variableParameter, )
+    def __init__(self, id=None, neuroLexId=None, ionChannel=None, condDensity=None, erev=None, segmentGroup='all', segment=None, ion=None, variableParameter=None):
+        super(ChannelDensitySub, self).__init__(id, neuroLexId, ionChannel, condDensity, erev, segmentGroup, segment, ion, variableParameter, )
 supermod.ChannelDensity.subclass = ChannelDensitySub
 # end class ChannelDensitySub
 
 
 class ChannelPopulationSub(supermod.ChannelPopulation):
-    def __init__(self, id=None, neuroLexId=None, segmentGroup='all', ion=None, number=None, ionChannel=None, erev=None, segment=None, variableParameter=None):
-        super(ChannelPopulationSub, self).__init__(id, neuroLexId, segmentGroup, ion, number, ionChannel, erev, segment, variableParameter, )
+    def __init__(self, id=None, neuroLexId=None, ionChannel=None, number=None, erev=None, segmentGroup='all', segment=None, ion=None, variableParameter=None):
+        super(ChannelPopulationSub, self).__init__(id, neuroLexId, ionChannel, number, erev, segmentGroup, segment, ion, variableParameter, )
 supermod.ChannelPopulation.subclass = ChannelPopulationSub
 # end class ChannelPopulationSub
 
@@ -542,8 +511,8 @@ supermod.BaseSynapse.subclass = BaseSynapseSub
 
 
 class DecayingPoolConcentrationModelSub(supermod.DecayingPoolConcentrationModel):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, ion=None, shellThickness=None, restingConc=None, decayConstant=None, extensiontype_=None):
-        super(DecayingPoolConcentrationModelSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, ion, shellThickness, restingConc, decayConstant, extensiontype_, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, ion=None, restingConc=None, decayConstant=None, shellThickness=None, extensiontype_=None):
+        super(DecayingPoolConcentrationModelSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, ion, restingConc, decayConstant, shellThickness, extensiontype_, )
 supermod.DecayingPoolConcentrationModel.subclass = DecayingPoolConcentrationModelSub
 # end class DecayingPoolConcentrationModelSub
 
@@ -584,8 +553,8 @@ supermod.GateHHUndetermined.subclass = GateHHUndeterminedSub
 
 
 class IonChannelSub(supermod.IonChannel):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, conductance=None, type_=None, species=None, gate=None, gateHHrates=None, gateHHratesTau=None, gateHHtauInf=None, gateHHratesInf=None):
-        super(IonChannelSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, conductance, type_, species, gate, gateHHrates, gateHHratesTau, gateHHtauInf, gateHHratesInf, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, species=None, type_=None, conductance=None, gate=None, gateHHrates=None, gateHHratesTau=None, gateHHtauInf=None, gateHHratesInf=None):
+        super(IonChannelSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, species, type_, conductance, gate, gateHHrates, gateHHratesTau, gateHHtauInf, gateHHratesInf, )
 supermod.IonChannel.subclass = IonChannelSub
 # end class IonChannelSub
 
@@ -605,57 +574,57 @@ supermod.BasePynnSynapse.subclass = BasePynnSynapseSub
 
 
 class basePyNNCellSub(supermod.basePyNNCell):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, extensiontype_=None):
-        super(basePyNNCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, extensiontype_, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, extensiontype_=None):
+        super(basePyNNCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, extensiontype_, )
 supermod.basePyNNCell.subclass = basePyNNCellSub
 # end class basePyNNCellSub
 
 
 class ConcentrationModel_DSub(supermod.ConcentrationModel_D):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, ion=None, shellThickness=None, restingConc=None, decayConstant=None, type_=None):
-        super(ConcentrationModel_DSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, ion, shellThickness, restingConc, decayConstant, type_, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, ion=None, restingConc=None, decayConstant=None, shellThickness=None, type_=None):
+        super(ConcentrationModel_DSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, ion, restingConc, decayConstant, shellThickness, type_, )
 supermod.ConcentrationModel_D.subclass = ConcentrationModel_DSub
 # end class ConcentrationModel_DSub
 
 
 class CellSub(supermod.Cell):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, biophysicalProperties_attr=None, morphology_attr=None, morphology=None, biophysicalProperties=None):
-        super(CellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, biophysicalProperties_attr, morphology_attr, morphology, biophysicalProperties, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, morphology_attr=None, biophysicalProperties_attr=None, morphology=None, biophysicalProperties=None):
+        super(CellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, morphology_attr, biophysicalProperties_attr, morphology, biophysicalProperties, )
 supermod.Cell.subclass = CellSub
 # end class CellSub
 
 
 class AdExIaFCellSub(supermod.AdExIaFCell):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, reset=None, EL=None, C=None, b=None, refract=None, VT=None, delT=None, a=None, thresh=None, gL=None, tauw=None):
-        super(AdExIaFCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, reset, EL, C, b, refract, VT, delT, a, thresh, gL, tauw, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, C=None, gL=None, EL=None, reset=None, VT=None, thresh=None, delT=None, tauw=None, refract=None, a=None, b=None):
+        super(AdExIaFCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, C, gL, EL, reset, VT, thresh, delT, tauw, refract, a, b, )
 supermod.AdExIaFCell.subclass = AdExIaFCellSub
 # end class AdExIaFCellSub
 
 
 class IzhikevichCellSub(supermod.IzhikevichCell):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, a=None, c=None, b=None, d=None, v0=None, thresh=None):
-        super(IzhikevichCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, a, c, b, d, v0, thresh, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, v0=None, thresh=None, a=None, b=None, c=None, d=None):
+        super(IzhikevichCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, v0, thresh, a, b, c, d, )
 supermod.IzhikevichCell.subclass = IzhikevichCellSub
 # end class IzhikevichCellSub
 
 
 class IaFCellSub(supermod.IaFCell):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, reset=None, C=None, thresh=None, leakConductance=None, leakReversal=None, extensiontype_=None):
-        super(IaFCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, reset, C, thresh, leakConductance, leakReversal, extensiontype_, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, leakReversal=None, thresh=None, reset=None, C=None, leakConductance=None, extensiontype_=None):
+        super(IaFCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, leakReversal, thresh, reset, C, leakConductance, extensiontype_, )
 supermod.IaFCell.subclass = IaFCellSub
 # end class IaFCellSub
 
 
 class IaFTauCellSub(supermod.IaFTauCell):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, reset=None, tau=None, thresh=None, leakReversal=None, extensiontype_=None):
-        super(IaFTauCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, reset, tau, thresh, leakReversal, extensiontype_, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, leakReversal=None, thresh=None, reset=None, tau=None, extensiontype_=None):
+        super(IaFTauCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, leakReversal, thresh, reset, tau, extensiontype_, )
 supermod.IaFTauCell.subclass = IaFTauCellSub
 # end class IaFTauCellSub
 
 
 class BaseConductanceBasedSynapseSub(supermod.BaseConductanceBasedSynapse):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, erev=None, gbase=None, extensiontype_=None):
-        super(BaseConductanceBasedSynapseSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, erev, gbase, extensiontype_, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, gbase=None, erev=None, extensiontype_=None):
+        super(BaseConductanceBasedSynapseSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, gbase, erev, extensiontype_, )
 supermod.BaseConductanceBasedSynapse.subclass = BaseConductanceBasedSynapseSub
 # end class BaseConductanceBasedSynapseSub
 
@@ -689,99 +658,99 @@ supermod.ExpCondSynapse.subclass = ExpCondSynapseSub
 
 
 class HH_cond_expSub(supermod.HH_cond_exp):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, gbar_K=None, e_rev_E=None, g_leak=None, e_rev_Na=None, e_rev_I=None, e_rev_K=None, e_rev_leak=None, v_offset=None, gbar_Na=None):
-        super(HH_cond_expSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, gbar_K, e_rev_E, g_leak, e_rev_Na, e_rev_I, e_rev_K, e_rev_leak, v_offset, gbar_Na, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, v_offset=None, e_rev_E=None, e_rev_I=None, e_rev_K=None, e_rev_Na=None, e_rev_leak=None, g_leak=None, gbar_K=None, gbar_Na=None):
+        super(HH_cond_expSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, v_offset, e_rev_E, e_rev_I, e_rev_K, e_rev_Na, e_rev_leak, g_leak, gbar_K, gbar_Na, )
 supermod.HH_cond_exp.subclass = HH_cond_expSub
 # end class HH_cond_expSub
 
 
 class basePyNNIaFCellSub(supermod.basePyNNIaFCell):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, tau_refrac=None, v_thresh=None, tau_m=None, v_reset=None, v_rest=None, extensiontype_=None):
-        super(basePyNNIaFCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, tau_refrac, v_thresh, tau_m, v_reset, v_rest, extensiontype_, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, tau_m=None, tau_refrac=None, v_reset=None, v_rest=None, v_thresh=None, extensiontype_=None):
+        super(basePyNNIaFCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, tau_m, tau_refrac, v_reset, v_rest, v_thresh, extensiontype_, )
 supermod.basePyNNIaFCell.subclass = basePyNNIaFCellSub
 # end class basePyNNIaFCellSub
 
 
 class IaFRefCellSub(supermod.IaFRefCell):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, reset=None, C=None, thresh=None, leakConductance=None, leakReversal=None, refract=None):
-        super(IaFRefCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, reset, C, thresh, leakConductance, leakReversal, refract, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, leakReversal=None, thresh=None, reset=None, C=None, leakConductance=None, refract=None):
+        super(IaFRefCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, leakReversal, thresh, reset, C, leakConductance, refract, )
 supermod.IaFRefCell.subclass = IaFRefCellSub
 # end class IaFRefCellSub
 
 
 class IaFTauRefCellSub(supermod.IaFTauRefCell):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, reset=None, tau=None, thresh=None, leakReversal=None, refract=None):
-        super(IaFTauRefCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, reset, tau, thresh, leakReversal, refract, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, leakReversal=None, thresh=None, reset=None, tau=None, refract=None):
+        super(IaFTauRefCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, leakReversal, thresh, reset, tau, refract, )
 supermod.IaFTauRefCell.subclass = IaFTauRefCellSub
 # end class IaFTauRefCellSub
 
 
 class ExpTwoSynapseSub(supermod.ExpTwoSynapse):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, erev=None, gbase=None, tauDecay=None, tauRise=None, extensiontype_=None):
-        super(ExpTwoSynapseSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, erev, gbase, tauDecay, tauRise, extensiontype_, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, gbase=None, erev=None, tauDecay=None, tauRise=None, extensiontype_=None):
+        super(ExpTwoSynapseSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, gbase, erev, tauDecay, tauRise, extensiontype_, )
 supermod.ExpTwoSynapse.subclass = ExpTwoSynapseSub
 # end class ExpTwoSynapseSub
 
 
 class ExpOneSynapseSub(supermod.ExpOneSynapse):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, erev=None, gbase=None, tauDecay=None):
-        super(ExpOneSynapseSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, erev, gbase, tauDecay, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, gbase=None, erev=None, tauDecay=None):
+        super(ExpOneSynapseSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, gbase, erev, tauDecay, )
 supermod.ExpOneSynapse.subclass = ExpOneSynapseSub
 # end class ExpOneSynapseSub
 
 
 class IF_curr_expSub(supermod.IF_curr_exp):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, tau_refrac=None, v_thresh=None, tau_m=None, v_reset=None, v_rest=None):
-        super(IF_curr_expSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, tau_refrac, v_thresh, tau_m, v_reset, v_rest, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, tau_m=None, tau_refrac=None, v_reset=None, v_rest=None, v_thresh=None):
+        super(IF_curr_expSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, tau_m, tau_refrac, v_reset, v_rest, v_thresh, )
 supermod.IF_curr_exp.subclass = IF_curr_expSub
 # end class IF_curr_expSub
 
 
 class IF_curr_alphaSub(supermod.IF_curr_alpha):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, tau_refrac=None, v_thresh=None, tau_m=None, v_reset=None, v_rest=None):
-        super(IF_curr_alphaSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, tau_refrac, v_thresh, tau_m, v_reset, v_rest, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, tau_m=None, tau_refrac=None, v_reset=None, v_rest=None, v_thresh=None):
+        super(IF_curr_alphaSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, tau_m, tau_refrac, v_reset, v_rest, v_thresh, )
 supermod.IF_curr_alpha.subclass = IF_curr_alphaSub
 # end class IF_curr_alphaSub
 
 
 class basePyNNIaFCondCellSub(supermod.basePyNNIaFCondCell):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, tau_refrac=None, v_thresh=None, tau_m=None, v_reset=None, v_rest=None, e_rev_I=None, e_rev_E=None, extensiontype_=None):
-        super(basePyNNIaFCondCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, tau_refrac, v_thresh, tau_m, v_reset, v_rest, e_rev_I, e_rev_E, extensiontype_, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, tau_m=None, tau_refrac=None, v_reset=None, v_rest=None, v_thresh=None, e_rev_E=None, e_rev_I=None, extensiontype_=None):
+        super(basePyNNIaFCondCellSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, tau_m, tau_refrac, v_reset, v_rest, v_thresh, e_rev_E, e_rev_I, extensiontype_, )
 supermod.basePyNNIaFCondCell.subclass = basePyNNIaFCondCellSub
 # end class basePyNNIaFCondCellSub
 
 
 class BlockingPlasticSynapseSub(supermod.BlockingPlasticSynapse):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, erev=None, gbase=None, tauDecay=None, tauRise=None, plasticityMechanism=None, blockMechanism=None):
-        super(BlockingPlasticSynapseSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, erev, gbase, tauDecay, tauRise, plasticityMechanism, blockMechanism, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, gbase=None, erev=None, tauDecay=None, tauRise=None, plasticityMechanism=None, blockMechanism=None):
+        super(BlockingPlasticSynapseSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, gbase, erev, tauDecay, tauRise, plasticityMechanism, blockMechanism, )
 supermod.BlockingPlasticSynapse.subclass = BlockingPlasticSynapseSub
 # end class BlockingPlasticSynapseSub
 
 
 class EIF_cond_alpha_isfa_istaSub(supermod.EIF_cond_alpha_isfa_ista):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, tau_refrac=None, v_thresh=None, tau_m=None, v_reset=None, v_rest=None, e_rev_I=None, e_rev_E=None, a=None, delta_T=None, b=None, v_spike=None, tau_w=None):
-        super(EIF_cond_alpha_isfa_istaSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, tau_refrac, v_thresh, tau_m, v_reset, v_rest, e_rev_I, e_rev_E, a, delta_T, b, v_spike, tau_w, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, tau_m=None, tau_refrac=None, v_reset=None, v_rest=None, v_thresh=None, e_rev_E=None, e_rev_I=None, a=None, b=None, delta_T=None, tau_w=None, v_spike=None):
+        super(EIF_cond_alpha_isfa_istaSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, tau_m, tau_refrac, v_reset, v_rest, v_thresh, e_rev_E, e_rev_I, a, b, delta_T, tau_w, v_spike, )
 supermod.EIF_cond_alpha_isfa_ista.subclass = EIF_cond_alpha_isfa_istaSub
 # end class EIF_cond_alpha_isfa_istaSub
 
 
 class EIF_cond_exp_isfa_istaSub(supermod.EIF_cond_exp_isfa_ista):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, tau_refrac=None, v_thresh=None, tau_m=None, v_reset=None, v_rest=None, e_rev_I=None, e_rev_E=None, a=None, delta_T=None, b=None, v_spike=None, tau_w=None):
-        super(EIF_cond_exp_isfa_istaSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, tau_refrac, v_thresh, tau_m, v_reset, v_rest, e_rev_I, e_rev_E, a, delta_T, b, v_spike, tau_w, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, tau_m=None, tau_refrac=None, v_reset=None, v_rest=None, v_thresh=None, e_rev_E=None, e_rev_I=None, a=None, b=None, delta_T=None, tau_w=None, v_spike=None):
+        super(EIF_cond_exp_isfa_istaSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, tau_m, tau_refrac, v_reset, v_rest, v_thresh, e_rev_E, e_rev_I, a, b, delta_T, tau_w, v_spike, )
 supermod.EIF_cond_exp_isfa_ista.subclass = EIF_cond_exp_isfa_istaSub
 # end class EIF_cond_exp_isfa_istaSub
 
 
 class IF_cond_expSub(supermod.IF_cond_exp):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, tau_refrac=None, v_thresh=None, tau_m=None, v_reset=None, v_rest=None, e_rev_I=None, e_rev_E=None):
-        super(IF_cond_expSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, tau_refrac, v_thresh, tau_m, v_reset, v_rest, e_rev_I, e_rev_E, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, tau_m=None, tau_refrac=None, v_reset=None, v_rest=None, v_thresh=None, e_rev_E=None, e_rev_I=None):
+        super(IF_cond_expSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, tau_m, tau_refrac, v_reset, v_rest, v_thresh, e_rev_E, e_rev_I, )
 supermod.IF_cond_exp.subclass = IF_cond_expSub
 # end class IF_cond_expSub
 
 
 class IF_cond_alphaSub(supermod.IF_cond_alpha):
-    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, tau_syn_I=None, tau_syn_E=None, i_offset=None, cm=None, v_init=None, tau_refrac=None, v_thresh=None, tau_m=None, v_reset=None, v_rest=None, e_rev_I=None, e_rev_E=None):
-        super(IF_cond_alphaSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, tau_syn_I, tau_syn_E, i_offset, cm, v_init, tau_refrac, v_thresh, tau_m, v_reset, v_rest, e_rev_I, e_rev_E, )
+    def __init__(self, id=None, neuroLexId=None, name=None, metaid=None, notes=None, annotation=None, cm=None, i_offset=None, tau_syn_E=None, tau_syn_I=None, v_init=None, tau_m=None, tau_refrac=None, v_reset=None, v_rest=None, v_thresh=None, e_rev_E=None, e_rev_I=None):
+        super(IF_cond_alphaSub, self).__init__(id, neuroLexId, name, metaid, notes, annotation, cm, i_offset, tau_syn_E, tau_syn_I, v_init, tau_m, tau_refrac, v_reset, v_rest, v_thresh, e_rev_E, e_rev_I, )
 supermod.IF_cond_alpha.subclass = IF_cond_alphaSub
 # end class IF_cond_alphaSub
 
@@ -795,8 +764,9 @@ def get_root_tag(node):
     return tag, rootClass
 
 
-def parse(inFilename):
-    doc = parsexml_(inFilename)
+def parse(inFilename, silence=False):
+    parser = None
+    doc = parsexml_(inFilename, parser)
     rootNode = doc.getroot()
     rootTag, rootClass = get_root_tag(rootNode)
     if rootClass is None:
@@ -806,16 +776,18 @@ def parse(inFilename):
     rootObj.build(rootNode)
     # Enable Python to collect the space used by the DOM.
     doc = None
-##     sys.stdout.write('<?xml version="1.0" ?>\n')
-##     rootObj.export(
-##         sys.stdout, 0, name_=rootTag,
-##         namespacedef_='',
-##         pretty_print=True)
+    if not silence:
+        sys.stdout.write('<?xml version="1.0" ?>\n')
+        rootObj.export(
+            sys.stdout, 0, name_=rootTag,
+            namespacedef_='',
+            pretty_print=True)
     return rootObj
 
 
-def parseEtree(inFilename):
-    doc = parsexml_(inFilename)
+def parseEtree(inFilename, silence=False):
+    parser = None
+    doc = parsexml_(inFilename, parser)
     rootNode = doc.getroot()
     rootTag, rootClass = get_root_tag(rootNode)
     if rootClass is None:
@@ -828,17 +800,19 @@ def parseEtree(inFilename):
     mapping = {}
     rootElement = rootObj.to_etree(None, name_=rootTag, mapping_=mapping)
     reverse_mapping = rootObj.gds_reverse_node_mapping(mapping)
-##     content = etree_.tostring(
-##         rootElement, pretty_print=True,
-##         xml_declaration=True, encoding="utf-8")
-##     sys.stdout.write(content)
-##     sys.stdout.write('\n')
+    if not silence:
+        content = etree_.tostring(
+            rootElement, pretty_print=True,
+            xml_declaration=True, encoding="utf-8")
+        sys.stdout.write(content)
+        sys.stdout.write('\n')
     return rootObj, rootElement, mapping, reverse_mapping
 
 
-def parseString(inString):
-    from io import StringIO
-    doc = parsexml_(StringIO(inString))
+def parseString(inString, silence=False):
+    try:
+        from StringIO import StringIO  # Python 2
+    except ImportError:
+        from io import StringIO  # Python 3
+    parser = None
+    doc = parsexml_(StringIO(inString), parser)
     rootNode = doc.getroot()
     rootTag, rootClass = get_root_tag(rootNode)
     if rootClass is None:
@@ -848,29 +822,32 @@ def parseString(inString):
     rootObj.build(rootNode)
     # Enable Python to collect the space used by the DOM.
     doc = None
-##     sys.stdout.write('<?xml version="1.0" ?>\n')
-##     rootObj.export(
-##         sys.stdout, 0, name_=rootTag,
-##         namespacedef_='')
+    if not silence:
+        sys.stdout.write('<?xml version="1.0" ?>\n')
+        rootObj.export(
+            sys.stdout, 0, name_=rootTag,
+            namespacedef_='')
     return rootObj
 
 
-def parseLiteral(inFilename):
-    doc = parsexml_(inFilename)
+def parseLiteral(inFilename, silence=False):
+    parser = None
+    doc = parsexml_(inFilename, parser)
     rootNode = doc.getroot()
-    roots = get_root_tag(rootNode)
-    rootClass = roots[1]
+    rootTag, rootClass = get_root_tag(rootNode)
     if rootClass is None:
+        rootTag = 'Annotation'
         rootClass = supermod.Annotation
     rootObj = rootClass.factory()
     rootObj.build(rootNode)
     # Enable Python to collect the space used by the DOM.
     doc = None
-##     sys.stdout.write('#from ??? import *\n\n')
-##     sys.stdout.write('import ??? as model_\n\n')
-##     sys.stdout.write('rootObj = model_.Annotation(\n')
-##     rootObj.exportLiteral(sys.stdout, 0, name_="Annotation")
-##     sys.stdout.write(')\n')
+    if not silence:
+        sys.stdout.write('#from ??? import *\n\n')
+        sys.stdout.write('import ??? as model_\n\n')
+        sys.stdout.write('rootObj = model_.rootClass(\n')
+        rootObj.exportLiteral(sys.stdout, 0, name_=rootTag)
+        sys.stdout.write(')\n')
     return rootObj
 
 
diff --git a/python/moose/neuroml2/test_files/NML2_FullCell.nml b/python/moose/neuroml2/test_files/NML2_FullCell.nml
index e45c9116..b6937504 100644
--- a/python/moose/neuroml2/test_files/NML2_FullCell.nml
+++ b/python/moose/neuroml2/test_files/NML2_FullCell.nml
@@ -3,15 +3,14 @@
 <neuroml xmlns="http://www.neuroml.org/schema/neuroml2"
     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xmlns:xi="http://www.w3.org/2001/XInclude"
-      xsi:schemaLocation="http://www.neuroml.org/schema/neuroml2 http://neuroml.svn.sourceforge.net/viewvc/neuroml/NeuroML2/Schemas/NeuroML2/NeuroML_v2alpha.xsd"
+      xsi:schemaLocation="http://www.neuroml.org/schema/neuroml2 ../Schemas/NeuroML2/NeuroML_v2beta3.xsd"
     id="NML2_FullCell">
         
 
 <!-- Example of a multicompartmental cell with biophysics in NeuroML 2 -->
 
-<!-- This is a "pure" NeuroML 2 file. It cannot yet be used a simulation by the LEMS 
-     Interpreter as this does not yet support multicompartment cells -->   
-
+<!-- This is a "pure" NeuroML 2 file. It cannot currently be used for simulations with
+     jLEMS/jNeuroML, however, as jLEMS does not yet support multicompartmental cells -->
 
     <include href="SimpleIonChannel.xml"/> <!-- Contains ionChannel NaConductance -->
 
@@ -25,12 +24,12 @@
         <annotation>
             <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:bqbiol="http://biomodels.net/biology-qualifiers/">
               <rdf:Description rdf:about="HippoCA1Cell">
-                <bqbiol:isVersionOf>
+                <bqbiol:is>
                   <rdf:Bag>
                     <!-- This cell model is a version of a hippocampal CA1 pyramidal cell -->
                     <rdf:li rdf:resource="urn:miriam:neurondb:258"/>
                   </rdf:Bag>
-                </bqbiol:isVersionOf>
+                </bqbiol:is>
               </rdf:Description>
             </rdf:RDF>
         </annotation>
@@ -82,27 +81,20 @@
 
             <membraneProperties> 
 
-                <channelPopulation id="naChansDend" ionChannel="NaConductance" segment="2" number="120000"/>   <!-- Use population instead of density -->
+                <channelPopulation id="naChansDend" ionChannel="NaConductance" segment="2" number="120000" erev="50mV"/>   <!-- Use population instead of density -->
 
-                <channelDensity id="pasChans" ionChannel="pas" condDensity="3.0 S_per_m2"/> <!-- no segmentGroup => all segments! -->
+                <channelDensity id="pasChans" ionChannel="pas" condDensity="3.0 S_per_m2" erev="-70mV"/> <!-- no segmentGroup => all segments! -->
 
-                <channelDensity id="naChansSoma" ionChannel="NaConductance" segmentGroup="soma_group" condDensity="120.0 mS_per_cm2"/>
+                <channelDensity id="naChansSoma" ionChannel="NaConductance" segmentGroup="soma_group" condDensity="120.0 mS_per_cm2" erev="50mV"/>
 
                 <specificCapacitance segmentGroup="soma_group" value="1.0 uF_per_cm2"/>
 
                 <specificCapacitance segmentGroup="dendrite_group" value="2.0 uF_per_cm2"/>
 
-                <reversalPotential species="na" value="55mV"/>
-
             </membraneProperties>
 
             <intracellularProperties>
 
-                <!-- Ions present inside the cell. -->
-                <species id="ca">
-                    <fixedConcentration  concentration="1e-5 mM"/>
-                </species>
-
                 <resistivity value="0.1 kohm_cm"/>  <!-- Used for specific axial resistance -->
 
             </intracellularProperties>
diff --git a/python/moose/neuroml2/test_files/SimpleIonChannel.xml b/python/moose/neuroml2/test_files/SimpleIonChannel.xml
index ea7c8cd7..354cf2eb 100644
--- a/python/moose/neuroml2/test_files/SimpleIonChannel.xml
+++ b/python/moose/neuroml2/test_files/SimpleIonChannel.xml
@@ -1,15 +1,24 @@
-<neuroml xmlns="http://www.neuroml.org/schema/neuroml2"  xmlns:xi="http://www.w3.org/2001/XInclude" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.w3.org/2001/XMLSchema" id="ChannelMLDemo">
-    <ionChannel id="NaConductance" conductance="10pS" type="ionChannelHH" species="na">
-        <notes>This is an example voltage-gated Na channel</notes>
-        <gate id="m" instances="3">
-            <forwardRate midpoint="-65mV" rate="0.07per_ms" scale="-20mV" type="HHExpRate"/>
-            <reverseRate midpoint="-35mV" rate="1per_ms" scale="10mV" type="HHSigmoidRate"/>
-        </gate>
-        <gate id="h" instances="1">
-            <forwardRate midpoint="-55mV" rate="0.1per_ms" scale="10mV" type="HHExpLinearRate"/>
-            <reverseRate midpoint="-65mV" rate="0.125per_ms" scale="-80mV" type="HHExpRate"/>
-        </gate>
-    </ionChannel>
-    <ionChannel id="pas" type="ionChannelPassive">
-    </ionChannel>
+<?xml version="1.0" encoding="UTF-8"?>
+
+<neuroml xmlns="http://www.neuroml.org/schema/neuroml2"
+         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://www.neuroml.org/schema/neuroml2  ../Schemas/NeuroML2/NeuroML_v2beta3.xsd"
+         id="NML2_SimpleIonChannel">  
+
+    <!-- Example of a simple Na+ ion channel in NeuroML 2 -->
+    
+    <ionChannelHH id="na" conductance="10pS" species="na">
+
+        <gateHHrates id="m" instances="3">
+            <forwardRate type="HHExpLinearRate" rate="1per_ms" midpoint="-40mV" scale="10mV"/>
+            <reverseRate type="HHExpRate" rate="4per_ms" midpoint="-65mV" scale="-18mV"/>
+        </gateHHrates>
+
+        <gateHHrates id="h" instances="1">
+            <forwardRate type="HHExpRate" rate="0.07per_ms" midpoint="-65mV" scale="-20mV"/>
+            <reverseRate type="HHSigmoidRate" rate="1per_ms" midpoint="-35mV" scale="10mV"/>
+        </gateHHrates>
+
+    </ionChannelHH>
+
 </neuroml>
diff --git a/python/moose/writekkit.py b/python/moose/writekkit.py
deleted file mode 100644
index f23e4968..00000000
--- a/python/moose/writekkit.py
+++ /dev/null
@@ -1,506 +0,0 @@
-import sys
-import random
-from . import wildcardFind, element, loadModel, ChemCompt, exists, Annotator, Pool, ZombiePool,PoolBase,CplxEnzBase,Function,ZombieFunction
-import numpy as np
-
-#Todo : To be written
-#               --Notes
-#               --StimulusTable
-
-def writeKkit( modelpath, filename,sceneitems=None):
-        global NA
-        NA = 6.0221415e23
-        global xmin,xmax,ymin,ymax
-        global cord
-        global multi
-        xmin = ymin = 0
-        xmax = ymax = 1
-        multi = 50
-        cord = {}
-        compt = wildcardFind(modelpath+'/##[ISA=ChemCompt]')
-        maxVol = estimateDefaultVol(compt)
-        f = open(filename, 'w')
-        writeHeader (f,maxVol)
-        if (compt > 0):
-                if sceneitems == None:
-                        #if sceneitems is none (loaded from script) then check x,y cord exists
-                        xmin,ymin,xmax,ymax,positionInfoExist = getCor(modelpath,sceneitems)
-                        if not positionInfoExist:
-                                #incase of SBML or cspace or python Annotator is not populated then positionInfoExist= False
-                                #print " x and y cordinates doesn't exist so auto cordinates"
-                                print(" auto co-ordinates needs to be applied")
-                                pass
-                else:
-                        #This is when it comes from Gui where the objects are already layout on to scene
-                        # so using thoes co-ordinates
-                        xmin,ymin,xmax,ymax,positionInfoExist = getCor(modelpath,sceneitems)
-
-                gtId_vol = writeCompartment(modelpath,compt,f)
-                writePool(modelpath,f,gtId_vol)
-                reacList = writeReac(modelpath,f)
-                enzList = writeEnz(modelpath,f)
-                writeSumtotal(modelpath,f)
-                storeReacMsg(reacList,f)
-                storeEnzMsg(enzList,f)
-                writeGui(f)
-                tgraphs = wildcardFind(modelpath+'/##[ISA=Table2]')
-                if tgraphs:
-                        writeplot(tgraphs,f)
-                        storePlotMsgs(tgraphs,f)
-                writeFooter(f)
-                return True
-        else:
-                print("Warning: writeKkit:: No model found on " , modelpath)
-                return False
-
-def storeCplxEnzMsgs( enz, f ):
-        for sub in enz.neighbors["subOut"]:
-                s = "addmsg /kinetics/" + trimPath( sub ) + " /kinetics/" + trimPath(enz) + " SUBSTRATE n \n";
-                s = s+ "addmsg /kinetics/" + trimPath( enz ) + " /kinetics/" + trimPath( sub ) +        " REAC sA B \n";
-                f.write(s)
-        for prd in enz.neighbors["prd"]:
-                s = "addmsg /kinetics/" + trimPath( enz ) + " /kinetics/" + trimPath(prd) + " MM_PRD pA\n";
-                f.write( s )
-        for enzOut in enz.neighbors["enzOut"]:
-                s = "addmsg /kinetics/" + trimPath( enzOut ) + " /kinetics/" + trimPath(enz) + " ENZYME n\n";
-                s = s+ "addmsg /kinetics/" + trimPath( enz ) + " /kinetics/" + trimPath(enzOut) + " REAC eA B\n";
-                f.write( s )
-
-def storeMMenzMsgs( enz, f):
-        subList = enz.neighbors["subOut"]
-        prdList = enz.neighbors["prd"]
-        enzDestList = enz.neighbors["enzDest"]
-        for esub in subList:
-                es = "addmsg /kinetics/" + trimPath(element(esub)) + " /kinetics/" + trimPath(enz) + " SUBSTRATE n \n";
-                es = es+"addmsg /kinetics/" + trimPath(enz) + " /kinetics/" + trimPath(element(esub)) + " REAC sA B \n";
-                f.write(es)
-
-        for eprd in prdList:
-                es = "addmsg /kinetics/" + trimPath( enz ) + " /kinetics/" + trimPath( element(eprd)) + " MM_PRD pA \n";
-                f.write(es)
-        for eenzDest in enzDestList:
-                enzDest = "addmsg /kinetics/" + trimPath( element(eenzDest)) + " /kinetics/" + trimPath( enz ) + " ENZYME n \n";
-                f.write(enzDest)
-
-def storeEnzMsg( enzList, f):
-        for enz in enzList:
-                enzClass = enz.className
-                if (enzClass == "ZombieMMenz" or enzClass == "MMenz"):
-                        storeMMenzMsgs(enz, f)
-                else:
-                        storeCplxEnzMsgs( enz, f )
-
-def writeEnz( modelpath,f):
-        enzList = wildcardFind(modelpath+'/##[ISA=EnzBase]')
-        for enz in enzList:
-                x = random.randrange(0,10)
-                y = random.randrange(0,10)
-                textcolor = "green"
-                color = "red"
-                k1 = 0;
-                k2 = 0;
-                k3 = 0;
-                nInit = 0;
-                concInit = 0;
-                n = 0;
-                conc = 0;
-                enzParent = enz.parent
-                if (isinstance(enzParent.className,Pool)) or (isinstance(enzParent.className,ZombiePool)):
-                        print(" raise exception enz doesn't have pool as parent")
-                        return False
-                else:
-                        vol = enzParent.volume * NA * 1e-3;
-                        isMichaelisMenten = 0;
-                        enzClass = enz.className
-                        if (enzClass == "ZombieMMenz" or enzClass == "MMenz"):
-                                k1 = enz.numKm
-                                k3 = enz.kcat
-                                k2 = 4.0*k3;
-                                k1 = (k2 + k3) / k1;
-                                isMichaelisMenten = 1;
-
-                        elif (enzClass == "ZombieEnz" or enzClass == "Enz"):
-                                k1 = enz.k1
-                                k2 = enz.k2
-                                k3 = enz.k3
-                                cplx = enz.neighbors['cplx'][0]
-                                nInit = cplx.nInit[0];
-
-                        xe = cord[enz]['x']
-                        ye = cord[enz]['y']
-                        x = ((xe-xmin)/(xmax-xmin))*multi
-                        y = ((ye-ymin)/(ymax-ymin))*multi
-                        #y = ((ymax-ye)/(ymax-ymin))*multi
-                        einfo = enz.path+'/info'
-                        if exists(einfo):
-                                color = Annotator(einfo).getField('color')
-                                textcolor = Annotator(einfo).getField('textColor')
-                        f.write("simundump kenz /kinetics/" + trimPath(enz) + " " + str(0)+  " " +
-                                        str(concInit) + " " +
-                                        str(conc) + " " +
-                                        str(nInit) + " " +
-                                        str(n) + " " +
-                                        str(vol) + " " +
-                                        str(k1) + " " +
-                                        str(k2) + " " +
-                                        str(k3) + " " +
-                                        str(0) + " " +
-                                        str(isMichaelisMenten) + " " +
-                                        "\"\"" + " " +
-                                        str(color) + " " + str(textcolor) + " \"\"" +
-                                        " " + str(x) + " " + str(y) + " "+str(0)+"\n")
-        return enzList
-def storeReacMsg(reacList,f):
-        for reac in reacList:
-                reacPath = trimPath( reac);
-                sublist = reac.neighbors["subOut"]
-                prdlist = reac.neighbors["prd"]
-                for sub in sublist:
-                        s = "addmsg /kinetics/" + trimPath( sub ) + " /kinetics/" + reacPath +  " SUBSTRATE n \n";
-                        s =  s + "addmsg /kinetics/" + reacPath + " /kinetics/" + trimPath( sub ) +  " REAC A B \n";
-                        f.write(s)
-
-                for prd in prdlist:
-                        s = "addmsg /kinetics/" + trimPath( prd ) + " /kinetics/" + reacPath + " PRODUCT n \n";
-                        s = s + "addmsg /kinetics/" + reacPath + " /kinetics/" + trimPath( prd ) +  " REAC B A\n";
-                        f.write( s)
-
-def writeReac(modelpath,f):
-        reacList = wildcardFind(modelpath+'/##[ISA=ReacBase]')
-        for reac in reacList :
-                color = "blue"
-                textcolor = "red"
-                kf = reac.numKf
-                kb = reac.numKb
-                xr = cord[reac]['x']
-                yr = cord[reac]['y']
-                x = ((xr-xmin)/(xmax-xmin))*multi
-                y = ((yr-ymin)/(ymax-ymin))*multi
-                #y = ((ymax-yr)/(ymax-ymin))*multi
-                rinfo = reac.path+'/info'
-                if exists(rinfo):
-                        color = Annotator(rinfo).getField('color')
-                        textcolor = Annotator(rinfo).getField('textColor')
-                f.write("simundump kreac /kinetics/" + trimPath(reac) + " " +str(0) +" "+ str(kf) + " " + str(kb) + " \"\" " +
-                        str(color) + " " + str(textcolor) + " " + str(x) + " " + str(y) + " 0\n")
-        return reacList
-
-def trimPath(mobj):
-        original = mobj
-        mobj = element(mobj)
-        found = False
-        while not isinstance(mobj,ChemCompt) and mobj.path != "/":
-                mobj = element(mobj.parent)
-                found = True
-        if mobj.path == "/":
-                print("compartment is not found with the given path and the path has reached root ",original)
-                return
-        #other than the kinetics compartment, all the othername are converted to group in Genesis which are place under /kinetics
-        # Any moose object comes under /kinetics then one level down the path is taken.
-        # e.g /group/poolObject or /Reac
-        if found:
-                if mobj.name != "kinetics":
-                        splitpath = original.path[(original.path.find(mobj.name)):len(original.path)]
-                else:
-
-                        pos = original.path.find(mobj.name)
-                        slash = original.path.find('/',pos+1)
-                        splitpath = original.path[slash+1:len(original.path)]
-                return splitpath
-
-def writeSumtotal( modelpath,f):
-        funclist = wildcardFind(modelpath+'/##[ISA=Function]')
-        for func in funclist:
-                funcInputs = element(func.path+'/x[0]')
-                s = ""
-                for funcInput in funcInputs.neighbors["input"]:
-                        s = s+ "addmsg /kinetics/" + trimPath(funcInput)+ " /kinetics/" + trimPath(element(func.parent)) + " SUMTOTAL n nInit\n"
-                f.write(s)
-
-def storePlotMsgs( tgraphs,f):
-        s = ""
-        if tgraphs:
-                for graph in tgraphs:
-                        slash = graph.path.find('graphs')
-                        if not slash > -1:
-                                slash = graph.path.find('graph_0')
-                        if slash > -1:
-                                conc = graph.path.find('conc')
-                                if conc > -1 :
-                                        tabPath = graph.path[slash:len(graph.path)]
-                                else:
-                                        slash1 = graph.path.find('/',slash)
-                                        tabPath = "graphs/conc1" +graph.path[slash1:len(graph.path)]
-
-                                if len(element(graph).msgOut):
-                                        poolPath = (element(graph).msgOut)[0].e2.path
-                                        poolEle = element(poolPath)
-                                        poolName = poolEle.name
-                                        bgPath = (poolEle.path+'/info')
-                                        bg = Annotator(bgPath).color
-                                        s = s+"addmsg /kinetics/" + trimPath( poolEle ) + " /" + tabPath + \
-                                                " PLOT Co *" + poolName + " *" + bg +"\n";
-        f.write(s)
-
-def writeplot( tgraphs,f ):
-        if tgraphs:
-                for graphs in tgraphs:
-                        slash = graphs.path.find('graphs')
-                        if not slash > -1:
-                                slash = graphs.path.find('graph_0')
-                        if slash > -1:
-                                conc = graphs.path.find('conc')
-                                if conc > -1 :
-                                        tabPath = graphs.path[slash:len(graphs.path)]
-                                else:
-                                        slash1 = graphs.path.find('/',slash)
-                                        tabPath = "graphs/conc1" +graphs.path[slash1:len(graphs.path)]
-
-                                if len(element(graphs).msgOut):
-                                        poolPath = (element(graphs).msgOut)[0].e2.path
-                                        poolEle = element(poolPath)
-                                        poolAnno = (poolEle.path+'/info')
-                                        fg = Annotator(poolAnno).textColor
-                                        f.write("simundump xplot " + tabPath + " 3 524288 \\\n" + "\"delete_plot.w <s> <d>; edit_plot.D <w>\" " + fg + " 0 0 1\n")
-
-def writePool(modelpath,f,volIndex ):
-        for p in wildcardFind(modelpath+'/##[ISA=PoolBase]'):
-                slave_enable = 0
-                if (p.className == "BufPool" or p.className == "ZombieBufPool"):
-                        pool_children = p.children
-                        if pool_children== 0:
-                                slave_enable = 4
-                        else:
-                                for pchild in pool_children:
-                                        if not(pchild.className == "ZombieFunction") and not(pchild.className == "Function"):
-                                                slave_enable = 4
-                                        else:
-                                                slave_enable = 0
-                                                break
-
-                xp = cord[p]['x']
-                yp = cord[p]['y']
-                x = ((xp-xmin)/(xmax-xmin))*multi
-                y = ((yp-ymin)/(ymax-ymin))*multi
-                #y = ((ymax-yp)/(ymax-ymin))*multi
-
-                pinfo = p.path+'/info'
-                if exists(pinfo):
-                        color = Annotator(pinfo).getField('color')
-                        textcolor = Annotator(pinfo).getField('textColor')
-
-                geometryName = volIndex[p.volume]
-                volume = p.volume * NA * 1e-3
-                f.write("simundump kpool /kinetics/" + trimPath(p) + " 0 " +
-                        str(p.diffConst) + " " +
-                        str(0) + " " +
-                        str(0) + " " +
-                        str(0) + " " +
-                        str(p.nInit) + " " +
-                        str(0) + " " + str(0) + " " +
-                        str(volume)+ " " +
-                        str(slave_enable) +
-                        " /kinetics"+ geometryName + " " +
-                        str(color) +" " + str(textcolor) + " " + str(x) + " " + str(y) + " "+ str(0)+"\n")
-
-def getxyCord(xcord,ycord,list1,sceneitems):
-        for item in list1:
-                if not ( isinstance(item,Function) and isinstance(item,ZombieFunction) ):
-                        if sceneitems == None:
-                                objInfo = item.path+'/info'
-                                xpos = xyPosition(objInfo,'x')
-                                ypos = xyPosition(objInfo,'y')
-                        else:
-                                co = sceneitems[item]
-                                xpos = co.scenePos().x()
-                                ypos =-co.scenePos().y()
-                        cord[item] ={ 'x': xpos,'y':ypos}
-                        xcord.append(xpos)
-                        ycord.append(ypos)
-
-def xyPosition(objInfo,xory):
-    try:
-        return(float(element(objInfo).getField(xory)))
-    except ValueError:
-        return (float(0))
-def getCor(modelRoot,sceneitems):
-        xmin = ymin = 0.0
-        xmax = ymax = 1.0
-        positionInfoExist = False
-        xcord = ycord = []
-        mollist = realist = enzlist = cplxlist = tablist = funclist = []
-        meshEntryWildcard = '/##[ISA=ChemCompt]'
-        if modelRoot != '/':
-                meshEntryWildcard = modelRoot+meshEntryWildcard
-        for meshEnt in wildcardFind(meshEntryWildcard):
-                mol_cpl  = wildcardFind(meshEnt.path+'/##[ISA=PoolBase]')
-                realist  = wildcardFind(meshEnt.path+'/##[ISA=ReacBase]')
-                enzlist  = wildcardFind(meshEnt.path+'/##[ISA=EnzBase]')
-                funclist = wildcardFind(meshEnt.path+'/##[ISA=Function]')
-                tablist  = wildcardFind(meshEnt.path+'/##[ISA=StimulusTable]')
-                if mol_cpl or funclist or enzlist or realist or tablist:
-                        for m in mol_cpl:
-                                if isinstance(element(m.parent),CplxEnzBase):
-                                        cplxlist.append(m)
-                                        objInfo = m.parent.path+'/info'
-                                elif isinstance(element(m),PoolBase):
-                                        mollist.append(m)
-                                        objInfo =m.path+'/info'
-
-                                if sceneitems == None:
-                                        xx = xyPosition(objInfo,'x')
-                                        yy = xyPosition(objInfo,'y')
-                                else:
-                                        c = sceneitems[m]
-                                        xx = c.scenePos().x()
-                                        yy =-c.scenePos().y()
-
-                                cord[m] ={ 'x': xx,'y':yy}
-                                xcord.append(xx)
-                                ycord.append(yy)
-                        getxyCord(xcord,ycord,realist,sceneitems)
-                        getxyCord(xcord,ycord,enzlist,sceneitems)
-                        getxyCord(xcord,ycord,funclist,sceneitems)
-                        getxyCord(xcord,ycord,tablist,sceneitems)
-        xmin = min(xcord)
-        xmax = max(xcord)
-        ymin = min(ycord)
-        ymax = max(ycord)
-        positionInfoExist = not(len(np.nonzero(xcord)[0]) == 0 \
-                and len(np.nonzero(ycord)[0]) == 0)
-
-        return(xmin,ymin,xmax,ymax,positionInfoExist)
-
-def writeCompartment(modelpath,compts,f):
-        index = 0
-        volIndex = {}
-        for compt in compts:
-                if compt.name != "kinetics":
-                        xgrp = xmax -random.randrange(1,10)
-                        ygrp = ymin +random.randrange(1,10)
-                        x = ((xgrp-xmin)/(xmax-xmin))*multi
-                        #y = ((ymax-ygrp)/(ymax-ymin))*multi
-                        y = ((ygrp-ymin)/(ymax-ymin))*multi
-                        f.write("simundump group /kinetics/" + compt.name + " 0 " + "blue" + " " + "green"       + " x 0 0 \"\" defaultfile \\\n" )
-                        f.write( "  defaultfile.g 0 0 0 " + str(x) + " " + str(y) + " 0\n")
-        i = 0
-        l = len(compts)
-        geometry = ""
-        for compt in compts:
-                size = compt.volume
-                ndim = compt.numDimensions
-                vecIndex = l-i-1
-                #print vecIndex
-                i = i+1
-                xgeo = xmax -random.randrange(1,10)
-                ygeo = ymin +random.randrange(1,10)
-                x = ((xgeo-xmin)/(xmax-xmin))*multi
-                #y = ((ymax-ygeo)/(ymax-ymin))*multi
-                y = ((ygeo-ymin)/(ymax-ymin))*multi
-                if vecIndex > 0:
-                        geometry = geometry+"simundump geometry /kinetics" +  "/geometry[" + str(vecIndex) +"] 0 " + str(size) + " " + str(ndim) + " sphere " +" \"\" white black "+ str(x) + " " +str(y) +" 0\n";
-                        volIndex[size] = "/geometry["+str(vecIndex)+"]"
-                else:
-                        geometry = geometry+"simundump geometry /kinetics"  +  "/geometry 0 " + str(size) + " " + str(ndim) + " sphere " +" \"\" white black " + str(x) + " "+str(y)+ " 0\n";
-                        volIndex[size] = "/geometry"
-                f.write(geometry)
-        writeGroup(modelpath,f,xmax,ymax)
-        return volIndex
-
-def writeGroup(modelpath,f,xmax,ymax):
-        ignore = ["graphs","moregraphs","geometry","groups","conc1","conc2","conc3","conc4"]
-        for g in wildcardFind(modelpath+'/##[TYPE=Neutral]'):
-                if not g.name in ignore:
-                        if trimPath(g) != None:
-                                xgrp1 = xmax - random.randrange(1,10)
-                                ygrp1 = ymin + random.randrange(1,10)
-                                x = ((xgrp1-xmin)/(xmax-xmin))*multi
-                                #y = ((ymax-ygrp1)/(ymax-ymin))*multi
-                                y = ((ygrp1-ymin)/(ymax-ymin))*multi
-                                f.write("simundump group /kinetics/" + trimPath(g) + " 0 " +    "blue" + " " + "green"   + " x 0 0 \"\" defaultfile \\\n")
-                                f.write("  defaultfile.g 0 0 0 " + str(x) + " " + str(y) + " 0\n")
-
-def writeHeader(f,maxVol):
-        simdt = 0.001
-        plotdt = 0.1
-        rawtime = 100
-        maxtime = 100
-        defaultVol = maxVol
-        f.write("//genesis\n"
-                        "// kkit Version 11 flat dumpfile\n\n"
-                        "// Saved on " + str(rawtime)+"\n"
-                        "include kkit {argv 1}\n"
-                        "FASTDT = " + str(simdt)+"\n"
-                        "SIMDT = " +str(simdt)+"\n"
-                        "CONTROLDT = " +str(plotdt)+"\n"
-                        "PLOTDT = " +str(plotdt)+"\n"
-                        "MAXTIME = " +str(maxtime)+"\n"
-                        "TRANSIENT_TIME = 2"+"\n"
-                        "VARIABLE_DT_FLAG = 0"+"\n"
-                        "DEFAULT_VOL = " +str(defaultVol)+"\n"
-                        "VERSION = 11.0 \n"
-                        "setfield /file/modpath value ~/scripts/modules\n"
-                        "kparms\n\n"
-                        )
-        f.write( "//genesis\n"
-                        "initdump -version 3 -ignoreorphans 1\n"
-                        "simobjdump table input output alloced step_mode stepsize x y z\n"
-                        "simobjdump xtree path script namemode sizescale\n"
-                        "simobjdump xcoredraw xmin xmax ymin ymax\n"
-                        "simobjdump xtext editable\n"
-                        "simobjdump xgraph xmin xmax ymin ymax overlay\n"
-                        "simobjdump xplot pixflags script fg ysquish do_slope wy\n"
-                        "simobjdump group xtree_fg_req xtree_textfg_req plotfield expanded movealone \\\n"
-                                "  link savename file version md5sum mod_save_flag x y z\n"
-                        "simobjdump geometry size dim shape outside xtree_fg_req xtree_textfg_req x y z\n"
-                        "simobjdump kpool DiffConst CoInit Co n nInit mwt nMin vol slave_enable \\\n"
-                                "  geomname xtree_fg_req xtree_textfg_req x y z\n"
-                        "simobjdump kreac kf kb notes xtree_fg_req xtree_textfg_req x y z\n"
-                        "simobjdump kenz CoComplexInit CoComplex nComplexInit nComplex vol k1 k2 k3 \\\n"
-                                "  keepconc usecomplex notes xtree_fg_req xtree_textfg_req link x y z\n"
-                        "simobjdump stim level1 width1 delay1 level2 width2 delay2 baselevel trig_time \\\n"
-                                "  trig_mode notes xtree_fg_req xtree_textfg_req is_running x y z\n"
-                        "simobjdump xtab input output alloced step_mode stepsize notes editfunc \\\n"
-                                "  xtree_fg_req xtree_textfg_req baselevel last_x last_y is_running x y z\n"
-                        "simobjdump kchan perm gmax Vm is_active use_nernst notewriteReacs xtree_fg_req \\\n"
-                                "  xtree_textfg_req x y z\n"
-                        "simobjdump transport input output alloced step_mode stepsize dt delay clock \\\n"
-                                "  kf xtree_fg_req xtree_textfg_req x y z\n"
-                        "simobjdump proto x y z\n"
-                        )
-
-def estimateDefaultVol(compts):
-        maxVol = 0
-        vol = []
-        for compt in compts:
-                vol.append(compt.volume)
-        if len(vol) > 0:
-                return max(vol)
-        return maxVol
-
-def writeGui( f ):
-        f.write("simundump xgraph /graphs/conc1 0 0 99 0.001 0.999 0\n"
-        "simundump xgraph /graphs/conc2 0 0 100 0 1 0\n"
-        "simundump xgraph /moregraphs/conc3 0 0 100 0 1 0\n"
-        "simundump xgraph /moregraphs/conc4 0 0 100 0 1 0\n"
-        "simundump xcoredraw /edit/draw 0 -6 4 -2 6\n"
-        "simundump xtree /edit/draw/tree 0 \\\n"
-        "  /kinetics/#[],/kinetics/#[]/#[],/kinetics/#[]/#[]/#[][TYPE!=proto],/kinetics/#[]/#[]/#[][TYPE!=linkinfo]/##[] \"edit_elm.D <v>; drag_from_edit.w <d> <S> <x> <y> <z>\" auto 0.6\n"
-        "simundump xtext /file/notes 0 1\n")
-
-def writeFooter( f ):
-        f.write( "\nenddump\n" +
-           "complete_loading\n")
-
-if __name__ == "__main__":
-        import sys
-
-        filename = sys.argv[1]
-        modelpath = filename[0:filename.find('.')]
-        loadModel('/home/harsha/genesis_files/gfile/'+filename,'/'+modelpath,"gsl")
-        output = '/home/harsha/Desktop/moose2genesis/moosefolder_cmd__sep2_'+filename
-        written = writeKkit('/'+modelpath,output)
-        if written:
-                print(" file written to ",output)
-        else:
-                print(" could be written to kkit format")
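The deleted writeKkit helpers above place each group on the kkit canvas by linearly normalizing its MOOSE coordinates into the canvas range. A minimal standalone sketch of that mapping (the function name `normalize` is hypothetical; the formula mirrors the `x = ((xgrp1-xmin)/(xmax-xmin))*multi` lines in the removed code):

```python
def normalize(v, vmin, vmax, multi=1.0):
    # Map v from the range [vmin, vmax] onto [0, multi], as the kkit
    # writer does for group x/y positions before emitting
    # 'simundump group' records.
    return ((v - vmin) / (vmax - vmin)) * multi

# The midpoint of the input range lands at half of 'multi'.
print(normalize(5.0, 0.0, 10.0, multi=2.0))  # 1.0
```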
diff --git a/python/rdesigneur/rdesigneur.py b/python/rdesigneur/rdesigneur.py
index 803d3144..e267921c 100644
--- a/python/rdesigneur/rdesigneur.py
+++ b/python/rdesigneur/rdesigneur.py
@@ -27,9 +27,28 @@ import rmoogli
 from rdesigneurProtos import *
 from moose.neuroml.NeuroML import NeuroML
 from moose.neuroml.ChannelML import ChannelML
-import lxml
-from lxml import etree
-import h5py as h5
+
+try:
+  from lxml import etree
+except ImportError:
+  try:
+    # Python 2.5
+    import xml.etree.cElementTree as etree
+  except ImportError:
+    try:
+      # Python 2.5
+      import xml.etree.ElementTree as etree
+    except ImportError:
+      try:
+        # normal cElementTree install
+        import cElementTree as etree
+      except ImportError:
+        try:
+          # normal ElementTree install
+          import elementtree.ElementTree as etree
+        except ImportError:
+          print("Failed to import ElementTree from any known place")
+
 import csv
 
 #EREST_ACT = -70e-3
@@ -495,7 +514,7 @@ class rdesigneur:
             return (), ""
 
         kf = knownFields[field] # Find the field to decide type.
-        if ( kf[0] == 'CaConcBase' or kf[0] == 'ChanBase' ):
+        if ( kf[0] == 'CaConcBase' or kf[0] == 'ChanBase' or kf[0] == 'NMDAChan' ):
             objList = self._collapseElistToPathAndClass( comptList, plotSpec[2], kf[0] )
             # print ("objList: ", len(objList), kf[1])
             return objList, kf[1]
@@ -540,6 +559,7 @@ class rdesigneur:
             'Gbar':('ChanBase', 'getGbar', 1e9, 'chan max conductance (nS)' ),
             'Gk':('ChanBase', 'getGk', 1e9, 'chan conductance (nS)' ),
             'Ik':('ChanBase', 'getIk', 1e9, 'chan current (nA)' ),
+            'ICa':('NMDAChan', 'getICa', 1e9, 'Ca current (nA)' ),
             'Ca':('CaConcBase', 'getCa', 1e3, 'Ca conc (uM)' ),
             'n':('PoolBase', 'getN', 1, '# of molecules'),
             'conc':('PoolBase', 'getConc', 1000, 'Concentration (uM)' )
@@ -575,14 +595,15 @@ class rdesigneur:
     def _buildMoogli( self ):
         knownFields = {
             'Vm':('CompartmentBase', 'getVm', 1000, 'Memb. Potential (mV)', -80.0, 40.0 ),
-            'Im':('CompartmentBase', 'getIm', 1e9, 'Memb. current (nA)', -10, 10 ),
-            'inject':('CompartmentBase', 'getInject', 1e9, 'inject current (nA)', -10, 10 ),
-            'Gbar':('ChanBase', 'getGbar', 1e9, 'chan max conductance (nS)', 0, 1 ),
-            'Gk':('ChanBase', 'getGk', 1e9, 'chan conductance (nS)', 0, 1 ),
-            'Ik':('ChanBase', 'getIk', 1e9, 'chan current (nA)', -10, 10 ),
-            'Ca':('CaConcBase', 'getCa', 1e3, 'Ca conc (uM)', 0, 10 ),
-            'n':('PoolBase', 'getN', 1, '# of molecules', 0, 200 ),
-            'conc':('PoolBase', 'getConc', 1000, 'Concentration (uM)', 0, 2 )
+            'Im':('CompartmentBase', 'getIm', 1e9, 'Memb. current (nA)', -10.0, 10.0 ),
+            'inject':('CompartmentBase', 'getInject', 1e9, 'inject current (nA)', -10.0, 10.0 ),
+            'Gbar':('ChanBase', 'getGbar', 1e9, 'chan max conductance (nS)', 0.0, 1.0 ),
+            'Gk':('ChanBase', 'getGk', 1e9, 'chan conductance (nS)', 0.0, 1.0 ),
+            'Ik':('ChanBase', 'getIk', 1e9, 'chan current (nA)', -10.0, 10.0 ),
+            'ICa':('NMDAChan', 'getICa', 1e9, 'Ca current (nA)', -10.0, 10.0 ),
+            'Ca':('CaConcBase', 'getCa', 1e3, 'Ca conc (uM)', 0.0, 10.0 ),
+            'n':('PoolBase', 'getN', 1, '# of molecules', 0.0, 200.0 ),
+            'conc':('PoolBase', 'getConc', 1000, 'Concentration (uM)', 0.0, 2.0 )
         }
         moogliBase = moose.Neutral( self.modelPath + '/moogli' )
         k = 0
@@ -646,6 +667,7 @@ class rdesigneur:
             'Gbar':('ChanBase', 'getGbar', 1e9, 'chan max conductance (nS)' ),
             'Gk':('ChanBase', 'getGk', 1e9, 'chan conductance (nS)' ),
             'Ik':('ChanBase', 'getIk', 1e9, 'chan current (nA)' ),
+            'ICa':('NMDAChan', 'getICa', 1e9, 'Ca current (nA)' ),
             'Ca':('CaConcBase', 'getCa', 1e3, 'Ca conc (uM)' ),
             'n':('PoolBase', 'getN', 1, '# of molecules'),
             'conc':('PoolBase', 'getConc', 1000, 'Concentration (uM)' )
@@ -696,6 +718,7 @@ class rdesigneur:
             'Gbar':('ChanBase', 'getGbar', 1e9, 'chan max conductance (nS)' ),
             'Gk':('ChanBase', 'getGk', 1e9, 'chan conductance (nS)' ),
             'Ik':('ChanBase', 'getIk', 1e9, 'chan current (nA)' ),
+            'ICa':('NMDAChan', 'getICa', 1e9, 'Ca current (nA)' ),
             'Ca':('CaConcBase', 'getCa', 1e3, 'Ca conc (uM)' ),
             'n':('PoolBase', 'getN', 1, '# of molecules'),
             'conc':('PoolBase', 'getConc', 1000, 'Concentration (uM)' )
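The `knownFields` tables extended in this hunk map a plottable field name to a tuple of (base class, getter, display scale, axis label); raw SI values from MOOSE are multiplied by the scale for display. A sketch of that lookup, copying two entries from the diff (the helper `lookupAndScale` is hypothetical, not rdesigneur API):

```python
# Each entry: field -> (baseClass, getter, scale, label), as in the diff.
knownFields = {
    'Vm':  ('CompartmentBase', 'getVm', 1000, 'Memb. Potential (mV)'),
    'ICa': ('NMDAChan', 'getICa', 1e9, 'Ca current (nA)'),
}

def lookupAndScale(field, rawValue):
    # Convert a raw SI value to display units using the field's scale.
    cls, getter, scale, label = knownFields[field]
    return rawValue * scale, label

# A membrane potential stored as -0.065 V scales to millivolts.
scaled, label = lookupAndScale('Vm', -0.065)
```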
diff --git a/python/rdesigneur/rdesigneurProtos.py b/python/rdesigneur/rdesigneurProtos.py
index a0c3f956..48c7f904 100644
--- a/python/rdesigneur/rdesigneurProtos.py
+++ b/python/rdesigneur/rdesigneurProtos.py
@@ -105,6 +105,40 @@ def make_HH_K(name = 'HH_K', parent='/library', vmin=-120e-3, vmax=40e-3, vdivs=
     k.tick = -1
     return k
 
+#========================================================================
+#                SynChan: Glu receptor
+#========================================================================
+
+def make_glu( name ):
+    if moose.exists( '/library/' + name ):
+        return
+    glu = moose.SynChan( '/library/' + name )
+    glu.Ek = 0.0
+    glu.tau1 = 2.0e-3
+    glu.tau2 = 9.0e-3
+    sh = moose.SimpleSynHandler( glu.path + '/sh' )
+    moose.connect( sh, 'activationOut', glu, 'activation' )
+    sh.numSynapses = 1
+    sh.synapse[0].weight = 1
+    return glu
+
+#========================================================================
+#                SynChan: GABA receptor
+#========================================================================
+
+def make_GABA( name ):
+    if moose.exists( '/library/' + name ):
+        return
+    GABA = moose.SynChan( '/library/' + name )
+    GABA.Ek = EK + 10.0e-3
+    GABA.tau1 = 4.0e-3
+    GABA.tau2 = 9.0e-3
+    sh = moose.SimpleSynHandler( GABA.path + '/sh' )
+    moose.connect( sh, 'activationOut', GABA, 'activation' )
+    sh.numSynapses = 1
+    sh.synapse[0].weight = 1
+
+
 def makeChemOscillator( name = 'osc', parent = '/library' ):
     model = moose.Neutral( parent + '/' + name )
     compt = moose.CubeMesh( model.path + '/kinetics' )
@@ -208,14 +242,20 @@ def transformNMDAR( path ):
             moose.connect( caconc[0], 'concOut', nmdar, 'assignIntCa' )
     ################################################################
     # Utility function for building a compartment, used for spines.
-def buildCompt( pa, name, length, dia, xoffset, RM, RA, CM ):
+    # Builds a compartment object downstream (further away from the
+    # soma) of the specified parent 'pa'. If 'pa' is not a compartment,
+    # the new compartment is simply created under 'pa'. It is placed
+    # at the offset (x, y, z) and extends along (dx, dy, dz).
+
+def buildCompt( pa, name, RM = 1.0, RA = 1.0, CM = 0.01, dia = 1.0e-6, x = 0.0, y = 0.0, z = 0.0, dx = 10e-6, dy = 0.0, dz = 0.0 ):
+    length = np.sqrt( dx * dx + dy * dy + dz * dz )
     compt = moose.Compartment( pa.path + '/' + name )
-    compt.x0 = xoffset
-    compt.y0 = 0
-    compt.z0 = 0
-    compt.x = length + xoffset
-    compt.y = 0
-    compt.z = 0
+    compt.x0 = x
+    compt.y0 = y
+    compt.z0 = z
+    compt.x = dx + x
+    compt.y = dy + y
+    compt.z = dz + z
     compt.diameter = dia
     compt.length = length
     xa = dia * dia * PI / 4.0
@@ -225,6 +265,9 @@ def buildCompt( pa, name, length, dia, xoffset, RM, RA, CM ):
     compt.Cm = CM * sa
     return compt
 
+def buildComptWrapper( pa, name, length, dia, xoffset, RM, RA, CM ):
+    return buildCompt( pa, name, RM, RA, CM, dia = dia, x = xoffset, dx = length )
+
     ################################################################
     # Utility function for building a synapse, used for spines.
 def buildSyn( name, compt, Ek, tau1, tau2, Gbar, CM ):
@@ -302,10 +345,10 @@ def addSpineProto( name = 'spine',
         chanList = (),
         caTau = 0.0
         ):
-    assert moose.exists( parent ), "%s must exists" % parent
+    assert( moose.exists( parent ) ), "%s must exist" % parent
     spine = moose.Neutral( parent + '/' + name )
-    shaft = buildCompt( spine, 'shaft', shaftLen, shaftDia, 0.0, RM, RA, CM )
-    head = buildCompt( spine, 'head', headLen, headDia, shaftLen, RM, RA, CM )
+    shaft = buildComptWrapper( spine, 'shaft', shaftLen, shaftDia, 0.0, RM, RA, CM )
+    head = buildComptWrapper( spine, 'head', headLen, headDia, shaftLen, RM, RA, CM )
     moose.connect( shaft, 'axial', head, 'raxial' )
 
     if caTau > 0.0:
@@ -349,7 +392,7 @@ def makePassiveHHsoma(name = 'passiveHHsoma', parent='/library'):
     if not moose.exists( elecpath ):
         elecid = moose.Neuron( elecpath )
         dia = 500e-6
-        soma = buildCompt( elecid, 'soma', dia, dia, 0.0,
+        soma = buildComptWrapper( elecid, 'soma', dia, dia, 0.0,
             0.33333333, 3000, 0.01 )
         soma.initVm = -65e-3 # Resting of -65, from HH
         soma.Em = -54.4e-3 # 10.6 mV above resting of -65, from HH
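The reworked `buildCompt` derives the compartment length from the displacement vector (dx, dy, dz) and scales the specific passive properties to the cylinder's geometry. A sketch of that scaling under the standard cable-formula assumption (the hunk elides the `Rm`/`Ra` assignments, so those two formulas are inferred, not copied; `passiveParams` is a hypothetical helper, not a MOOSE call):

```python
import math

def passiveParams(RM, RA, CM, dia, dx, dy=0.0, dz=0.0):
    # Geometry scaling as in buildCompt: length from the displacement
    # vector, cross-section and surface area of a cylinder, then the
    # standard conversions from specific to absolute passive values.
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    xa = dia * dia * math.pi / 4.0   # cross-sectional area
    sa = length * dia * math.pi      # cylindrical surface area
    return {'Rm': RM / sa,           # membrane resistance (assumed)
            'Ra': RA * length / xa,  # axial resistance (assumed)
            'Cm': CM * sa}           # membrane capacitance (from diff)

# Defaults from the new signature: dia 1 um, dx 10 um.
p = passiveParams(RM=1.0, RA=1.0, CM=0.01, dia=1.0e-6, dx=10e-6)
```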
diff --git a/python/rdesigneur/rmoogli.py b/python/rdesigneur/rmoogli.py
index 9a527ed8..7f8fa883 100644
--- a/python/rdesigneur/rmoogli.py
+++ b/python/rdesigneur/rmoogli.py
@@ -12,15 +12,30 @@ import math
 import matplotlib
 import sys
 import moose
+import os
+
+# Check if DISPLAY environment variable is properly set. If not, warn the user
+# and continue.
+hasDisplay = True
+display = os.environ.get('DISPLAY',  '' )
+if not display:
+    hasDisplay = False
+    print( "Warning: Environment variable DISPLAY is not set."
+            " Did you forget to pass the -X or -Y switch to ssh?\n"
+            "MOOSE will continue without graphics.\n"
+            )
+
 hasMoogli = True
-try: 
-    from PyQt4 import QtGui
-    import moogli
-    import moogli.extensions.moose
-    app = QtGui.QApplication(sys.argv)
-except Exception as e:
-    print( 'Warning: Moogli not found. All moogli calls will use dummy functions' )
-    hasMoogli = False
+
+if hasDisplay:
+    try: 
+        from PyQt4 import QtGui
+        import moogli
+        import moogli.extensions.moose
+        app = QtGui.QApplication(sys.argv)
+    except Exception as e:
+        print( 'Warning: Moogli not found. All moogli calls will use dummy functions' )
+        hasMoogli = False
 
 
 runtime = 0.0
@@ -40,15 +55,13 @@ def getComptParent( obj ):
 def prelude( view ):
     view.home()
     view.pitch( math.pi / 2.0 )
-    view.zoom( 0.3 )
+    view.zoom( 0.05 )
     #network.groups["soma"].set( "color", moogli.colors.RED )
 
 # This func is used for the first viewer, it has to handle advancing time.
 def interlude( view ):
     moose.start( moogliDt )
     val = [ moose.getField( i, view.mooField, "double" ) * view.mooScale for i in view.mooObj ]
-    #print "LEN = ", len( val ), "field = ", view.mooField
-    
     view.mooGroup.set("color", val, view.mapper)
     view.yaw( rotation )
     #print moogliDt, len( val ), runtime
@@ -132,6 +145,7 @@ def makeMoogli( rd, mooObj, moogliEntry, fieldInfo ):
                                  scalar_range=moogli.geometry.Vec2f(
                                      moogliEntry[5],
                                      moogliEntry[6]))
+    cb.set_num_labels(3)
     view.attach_color_bar(cb)
     view.rd = rd
     view.mooObj = displayObj
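The rmoogli change above guards the PyQt4/moogli import behind a DISPLAY check, so headless runs degrade to dummy functions instead of crashing on import. The guard reduces to this predicate (`canUseGui` is a hypothetical name; the `environ` parameter is only for testability):

```python
import os

def canUseGui(environ=os.environ):
    # Same check rmoogli.py now performs before importing PyQt4 and
    # moogli: an empty or unset DISPLAY usually means no X server is
    # reachable (e.g. ssh without -X/-Y), so GUI setup is skipped.
    return bool(environ.get('DISPLAY', ''))

print(canUseGui({'DISPLAY': ':0'}), canUseGui({}))  # True False
```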
diff --git a/sbml/CMakeLists.txt b/sbml/CMakeLists.txt
deleted file mode 100644
index f08adef7..00000000
--- a/sbml/CMakeLists.txt
+++ /dev/null
@@ -1,9 +0,0 @@
-file(GLOB files_SRC "*.cpp")
-
-IF(LIBSBML_FOUND)
-    add_definitions(-DUSE_SBML)
-ENDIF(LIBSBML_FOUND)
-
-include_directories(../msg)
-include_directories(../basecode)
-add_library(moose_sbml ${files_SRC})
diff --git a/sbml/Makefile b/sbml/Makefile
deleted file mode 100644
index 445c3582..00000000
--- a/sbml/Makefile
+++ /dev/null
@@ -1,31 +0,0 @@
-#/**********************************************************************
-#** This program is part of 'MOOSE', the
-#** Messaging Object Oriented Simulation Environment,
-#** also known as GENESIS 3 base code.
-#**           copyright (C) 2004 Upinder S. Bhalla. and NCBS
-#** It is made available under the terms of the
-#** GNU Lesser General Public License version 2.1
-#** See the file COPYING.LIB for the full notice.
-#**********************************************************************/
-
-TARGET = _sbml.o
-
-OBJ = \
-	MooseSbmlWriter.o \
-	MooseSbmlReader.o \
-
-HEADERS = \
-	../basecode/header.h
-
-default: $(TARGET)
-
-$(OBJ)	: $(HEADERS)
-MooseSbmlWriter.o:	MooseSbmlWriter.h
-MooseSbmlReader.o:   MooseSbmlReader.h
-.cpp.o:
-	$(CXX) $(CXXFLAGS) -I. -I../basecode -I../msg $< -c
-
-$(TARGET): $(OBJ) $(HEADERS)
-	$(LD) -r -o $(TARGET) $(OBJ)
-clean:
-	rm -f *.o $(TARGET) core core.*
diff --git a/sbml/MooseSbmlReader.cpp b/sbml/MooseSbmlReader.cpp
deleted file mode 100644
index 8edc3523..00000000
--- a/sbml/MooseSbmlReader.cpp
+++ /dev/null
@@ -1,1468 +0,0 @@
-/*******************************************************************
- * File:            MooseSbmlReader.cpp
- * Description:
- * Author:          HarshaRani
- * E-mail:          hrani@ncbs.res.in
- ********************************************************************/
-/**********************************************************************
-** This program is part of 'MOOSE', the
-** Messaging Object Oriented Simulation Environment,
-** also known as GENESIS 3 base code.
-**           copyright (C) 2003-2016 Upinder S. Bhalla. and NCBS
-** It is made available under the terms of the
-** GNU Lesser General Public License version 2.1
-** See the file COPYING.LIB for the full notice.
-**********************************************************************/
-/****************************
-* Change log:
-
-* Originally created by Siji for l2v4 for 'trunk ' branch
-* Modified / adapted Harsharani for both l2v4 and l3v1
-
-hasOnlySubstanceUnit : false means if the compartment size is changed then it is assumed that its the concentration that must be updated
-to account for the size change.
-
-When importing SBML Level 2 models species for which the 
-A. initial value is given as initialAmount and hasOnlySubstanceUnits is set to true 
-(or compartment dimension is zero) are treated as amounts. 
-
-B. If hasOnlySubstanceUnits is set to false 
-but the initial value is given as amount the corresponding species are not converted to concentration, rather I substitue amount
-as moose can take nInit. 
-
-C. Species for which the initial value is given as initialConcentration are treated as concentrations by converted to milli
-
-It is then assumed that the value of species appearing in the kinetic rate laws have either amount or concentration units. 
-
-All rules are evaluated as given by the SBML model.
-According to the SBML standard rate laws of reactions are assumed to deliver a rate in amount/time. 
-In the case a species value is defined as concentration the rate law is converted to concentration/time.
-
-In models that have only compartments of a constant size of 1 the distinction between amounts and concentrations is not necessary. 
-***************/
-
-
-#ifdef USE_SBML
-
-#include <cmath>
-#include <stdexcept>
-#include <sbml/SBMLTypes.h>
-#include <sbml/UnitDefinition.h>
-#include <sbml/units/UnitFormulaFormatter.h>
-#include <sbml/units/FormulaUnitsData.h>
-#include <string>
-#include <stdlib.h>
-#include "header.h"
-#include "../shell/Shell.h"
-#include "../shell/Wildcard.h"
-//#include "../manager/SimManager.h"
-#include "MooseSbmlReader.h"
-//#include "../kinetics/FuncPool.h"
-
-using namespace std;
-map< string,double > parmValueMap;
-map< string,double> :: iterator pvm_iter;
-bool unitsDefined = true;
-/*  Harsha : TODO in
-    -Compartment
-      --Need to add group
-      --Need to deal with compartment outside
-    -Molecule
-      -- Need to add group (done commited to 6964)
-      -- Func pool and its math calculation need to be added.
-    -Loading Model from SBML
-      --Tested 1-30 testcase example model provided by l3v1 and l2v4 std.
-        ---These are the models that worked (sbml testcase)1-6,10,14-15,17-21,23-25,34,35,58
-	---Need to check
-	 ----what to do when boundarycondition is true i.e.,
-             differential equation derived from the reaction definitions
-             should not be calculated for the species(7-9,11-13,16)
-         ----kineticsLaw, Math fun has fraction,ceiling,reminder,power 28etc.
-         ----Events to be added 26
-	 ----initial Assisgment for compartment 27
-         ----when stoichiometry is rational number 22
-	 ---- For Michaelis-Menten kinetics km is not defined which is most of the case
-	      need to calculate
-*/
-
-/**
- * @brief Reads a given SBML file and loads it into MOOSE.
- *
- * @param filename Name of file, std::string.
- * @param location 
- * @param solverClass
- *
- * @return  Id on success. Some expcetion on failure.
- */
-Id moose::SbmlReader::read( string filename, string location, string solverClass) 
-{   stringstream global_warning;
-    FILE * fp = fopen( filename.c_str(), "r" );
-    if ( fp == NULL) {
-        stringstream ss;
-        ss << "File " << filename << " does not exist." << endl;
-        throw runtime_error(ss.str());
-    }
-
-    document_ = readSBML( filename.c_str() );
-    unsigned num_errors = document_->getNumErrors();
-    if ( num_errors > 0 ) {
-        cerr << "Errors encountered while reading" << endl;
-        document_->printErrors( cerr );
-        errorFlag_ = true;
-        return baseId;
-    }
-    model_= document_->getModel();
-    if ( model_ == 0 ) {
-        cout << "SBML: Error: No model present." << endl;
-        errorFlag_ = true;
-        return baseId;
-    }
-    if ( !errorFlag_ )
-        getGlobalParameter();
-
-    if ( !errorFlag_ ) {
-        
-        string modelName;
-        Id parentId;
-        findModelParent ( Id(), location, parentId, modelName ) ;
-        Shell* s = reinterpret_cast< Shell* >( Id().eref().data() );
-        Id baseId_ = s->doCreate( "Neutral", parentId, modelName, 1, MooseGlobal);
-        Id base_ =s->doCreate("Neutral",baseId_,"model",1,MooseGlobal);
-        assert( base_ != Id() );
-        //Map Compartment's SBML id to Moose ID
-        map< string,Id > comptSidMIdMap;
-        // Map between Molecule's SBML id to which it belongs compartment Moose Id
-        map< string, Id > molSidcmptMIdMap;
-
-        if ( !errorFlag_ ){
-            unitsDefined = true;
-            comptSidMIdMap = createCompartment(location, parentId, modelName, base_);
-            //comptUnitDefined is set true is checked if warning is set or not, only once its set.
-            if (unitsDefined == false)
-                global_warning << "The default volume unit has not been set in the model. "<<
-                                "Assuming liter as the default volume unit, MOOSE will convert to cubicMeter which is the default units for volume in MOOSE. \n";
-        }
-        if ( !errorFlag_ ){
-            unitsDefined = true;
-            molSidcmptMIdMap = createMolecule( comptSidMIdMap);
-            if (unitsDefined == false)
-                //SpeciesUnitDefined is set true is checked if warning is set or not, only once its set.
-                global_warning << "The default substance unit has not been set in the model. "<<
-                                "Assuming mole as the default substance unit, MOOSE will convert to milliMolar which is the default units for Substabce in MOOSE  \n";
-        }
-        if ( !errorFlag_ )
-            getRules();
-
-        if ( !errorFlag_ )
-            createReaction( molSidcmptMIdMap );
-        // or we get
-        //createReaction (result);
-
-        if ( errorFlag_ )
-            return baseId;
-        else {
-            // SimManager* sm = reinterpret_cast< SimManager* >(baseId.eref().data());
-            //Shell* s = reinterpret_cast< Shell* >(baseId.eref().data());
-            XMLNode * annotationNode = model_->getAnnotation();
-            if( annotationNode != NULL ) {
-                unsigned int num_children = annotationNode->getNumChildren();
-                for( unsigned int child_no = 0; child_no < num_children; child_no++ ) {
-                    XMLNode childNode = annotationNode->getChild( child_no );
-                    if ( childNode.getPrefix() == "moose" && childNode.getName() == "ModelAnnotation" ) {
-                        unsigned int num_gchildren = childNode.getNumChildren();
-                        for( unsigned int gchild_no = 0; gchild_no < num_gchildren; gchild_no++ ) {
-                            XMLNode &grandChildNode = childNode.getChild( gchild_no );
-                            string nodeName = grandChildNode.getName();
-                            if (grandChildNode.getNumChildren() == 1 ) {
-                                string plotValue;
-                                //double nodeValue;
-                                if(nodeName == "plots") {
-                                    Id graphs;
-                                    // Carrying on with the policy that all graphs will be created under /modelName
-                                    string datapath = baseId_.path() +"/data";
-                                    Id graphpath(datapath);
-                                    graphs = datapath;
-                                    graphs = s->doCreate("Neutral",baseId_,"data",1);
-                                    assert(graphs != Id());
-                                    Id graph;
-                                    string datagrph = graphs.path()+"/graph_1";
-                                    Id graph1(datagrph);
-                                    graph = s->doCreate("Neutral",graphs,"graph_0",1);
-                                    assert(graph != Id());
-                                    /*
-                                    // if plots exist then will be placing at "/data"
-                                    Id graphs;
-                                    //Id dataId;
-                                    Id dataIdTest;
-                                    if (parentId2 == Id())
-                                        graphs = s->doCreate( "Neutral", parentId2, "data", 1);
-                                    else
-                                        // need to check how to put / while coming from gui as the path is /model/modelName??? 27 jun 2014
-                                        findModelParent ( Id(), modelName, dataIdTest, modelName ) ;
-                                        string test = "/data";
-                                        Id tgraphs(test);
-                                        graphs=tgraphs;
-                                        //graphs = s->doCreate("Neutral",parentId,"data",1);
-                                        //Id dataId;
-                                        //if (dataId == Id())
-                                        //    cout << "Id " << dataId;
-                                        //    graphs = s->doCreate( "Neutral",dataId, "data", 1);
-                                        assert( graphs != Id() );
-                                        */  
-                                    plotValue = (grandChildNode.getChild(0).toXMLString()).c_str();
-                                    istringstream pltVal(plotValue);
-                                    string pltClean;
-                                    while (getline(pltVal,pltClean, ';')) {
-                                        pltClean.erase( remove( pltClean.begin(), pltClean.end(), ' ' ), pltClean.end() );
-                                        //string plotPath = location+pltClean;
-                                        string plotPath = base_.path()+pltClean;
-                                        Id plotSId(plotPath);
-                                        size_t pos = pltClean.find('/');
-                                        if (pos != std::string::npos)
-                                            pltClean = pltClean.substr(pos+1,pltClean.length());
-                                        /*
-                                        #Harsha:To create a tableName, e.g:'/compartmentName/groupName/ObjectName'
-                                        #       I have changed '/' to '@' and To keep the index of the ObjectName
-                                        #       I have changed '[' to '<' and ']' to '>'.
-                                        #       The same is follwed in the GUI
-                                        */
-                                        replace(pltClean.begin(),pltClean.end(),'/','@');
-                                        replace(pltClean.begin(),pltClean.end(),'[','<');
-                                        replace(pltClean.begin(),pltClean.end(),']','>');
-                                        // size_t Iindex = 0;
-                                        // while(true)
-                                        //     { size_t sindex = pltClean.find('[',Iindex);
-                                        //       size_t eindex = pltClean.find(']',Iindex);
-                                        //       if (sindex == std::string::npos) break;
-                                        //       pltClean.erase(sindex,eindex-sindex+1);
-                                        //       Iindex = eindex;
-                                        //     } //while true
-                                        string plotName =  pltClean + ".conc";
-                                        Id pltPath(graph.path());
-                                        Id tab = s->doCreate( "Table2", pltPath, plotName, 1 );
-                                        if (tab != Id())
-                                            s->doAddMsg("Single",tab,"requestOut",plotSId,"getConc");
-                                    }//while
-                                    /* passing /model and /data to clocks         */
-                                    //commented due to automatic scheduling
-                                    
-                                    /*
-                                    string comptPath =base_.path()+"/##";
-                                    s->doUseClock(comptPath,"process",4);
-
-                                    string tablePath = graphs.path()+"/##[TYPE=Table]";
-                                    s->doUseClock( tablePath, "process",8 );
-                                    */
-                                }//plots
-                                /*else
-                                  nodeValue = atof ((grandChildNode.getChild(0).toXMLString()).c_str());
-
-                                  if (nodeName == "runTime")
-                                  sm->setRunTime(nodeValue);
-                                  else if (nodeName == "simdt")
-                                  sm->setSimDt(nodeValue);
-                                  else if(nodeName == "plotdt")
-                                  sm->setPlotDt(nodeValue);
-                                  */
-
-                            } //grandChild
-                            else
-                                cout << "Warning: expected exactly ONE child of " << nodeName << " but none found "<<endl;
-                        } //gchild
-                    } //moose and modelAnnotation
-                }
-            }//annotation Node
-            else {
-                //4 for simdt and 8 for plotdt
-                //Harsha:Since scheduling is automatically done commeting this
-                
-                //s->doUseClock(base_.path()+"/##","process",4);
-                //s->doUseClock(+"/data/##[TYPE=Table]","process",8);
-                //s->doSetClock(4,0.1);
-                //s->doSetClock(8,0.1);
-                /*
-                s->doUseClock( "/data/##[TYPE=Table]", "proc", 16 );
-                double simdt = 0.1;
-                double plotdt = 1;
-                s->doSetClock( 11, simdt );
-                s->doSetClock( 12, simdt );
-                s->doSetClock( 13, simdt );
-                s->doSetClock( 14, simdt );
-                s->doSetClock( 16, plotdt );
-                s->doSetClock( 17, plotdt );
-                s->doSetClock( 18, plotdt );
-                */
-            }
-            vector< ObjId > compts;
-            string comptpath = base_.path()+"/##[ISA=ChemCompt]";
-            wildcardFind( comptpath, compts );
-            vector< ObjId >::iterator i = compts.begin();
-            string comptName = nameString(Field<string> :: get(ObjId(*i),"name"));
-            string simpath = base_.path() + "/##";
-            //s->doUseClock( simpath, "process", 4 );
-
-            //wildcardFind( plotpath, plots );
-            //Id pathexist(base_.path()+"/kinetics");
-            /*
-               if (solverClass.empty())
-               {
-               if( pathexist != Id())
-               sm->build(base_.eref(),&q,"rk5");
-               else
-               sm->buildForSBML(base_.eref(),&q,"rk5");
-               }
-               else
-               { if(pathexist != Id())
-               sm->build(base_.eref(),&q,solverClass);
-               else
-               sm->buildForSBML(base_.eref(),&q,solverClass);
-               }
-               */
-            //cout << "base_ " <<base_.path() << "baseId_ " << baseId_.path();
-            return baseId_;
-        }
-
-    } else
-        return baseId;
-}
-
-/**
- * @brief Map SBML compartments to MOOSE.
- *
- * @param location 
- * @param parentId Id. Id of the parent compartment.
- * @param modelName string. Name of the model.
- * @param base_ Id. Id of the parent element.
- *
- * @return std::map<string, Id> mapping SBML compartment ids to MOOSE Ids.
- */
-map< string,Id > moose::SbmlReader::createCompartment(string location, Id parentId, string modelName, Id base_) 
-{
-    /* In compartment: pending to add
-       -- the group
-       -- outside    
-       -- units of volume
-    */
-    Shell* s = reinterpret_cast< Shell* >( Id().eref().data() );
-    map< string,Id > comptSidMIdMap;
-    map< string,string > outsideMap;
-    map< string,string > ::iterator iter;
-    double msize = 0.0, size = 0.0;
-
-    ::Compartment* compt;
-    unsigned int num_compts = model_->getNumCompartments();
-
-    if (num_compts == 0) {
-        errorFlag_ = true;
-        stringstream ss;
-        return comptSidMIdMap;
-    }
-
-    baseId = base_;
-    for ( unsigned int i = 0; i < num_compts; i++ ) {
-        compt = model_->getCompartment(i);
-        std::string id = "";
-        if ( compt->isSetId() ) {
-            id = compt->getId();
-        }
-
-        std::string name = "";
-        if ( compt->isSetName() ) {
-            name = compt->getName();
-            name = nameString(name);
-        }
-
-        std::string outside = "";
-        if ( compt->isSetOutside() ) {
-            outside = compt->getOutside ();
-        }
-        if ( compt->isSetSize() ) {
-            msize = compt->getSize();
-        }
-
-        UnitDefinition * ud = compt->getDerivedUnitDefinition();
-        size = transformUnits( msize,ud , "compartment",0);
-        unsigned int dimension = compt->getSpatialDimensions();
-
-        if (dimension < 3)
-            cout << "\n ###### Spatial dimension is " << dimension << "; volume is converted from litre to cubic metre by default, which may not be correct for dimension < 3 \n";
-
-        if(name.empty() && id.empty())
-            cout <<  "Compartment name and id are empty" << endl;
-
-        if (name.empty()) {
-            if(! id.empty() )
-                name = id;
-        }
-
-        Id comptId = s->doCreate( "CubeMesh", base_, name, 1 );
-        comptSidMIdMap[id] = comptId;
-        if (size != 0.0)
-            Field< double >::set( comptId, "volume", size );
-        if (dimension != 0)
-            continue;
-        //Field < int > :: set(comptId, "numDimensions", dimension);
-    }
-    return comptSidMIdMap;
-}
-
-/* create MOLECULE  */
-const moose::SbmlReader::sbmlStr_mooseId moose::SbmlReader::createMolecule( map< string,Id > &comptSidMIdMap) {
-    Shell* shell = reinterpret_cast< Shell* >( Id().eref().data() );
-    map< string, Id >molSidcmptMIdMap;
-    double transvalue = 0.0;
-    int num_species = model_->getNumSpecies();
-    if (num_species == 0) {
-        baseId = Id();
-        errorFlag_ = true;
-        return molSidcmptMIdMap;
-    }
-
-    for ( int sindex = 0; sindex < num_species; sindex++ ) {
-        Species* spe = model_->getSpecies(sindex);
-        
-        if (!spe) {
-            continue;
-        }
-        std::string compt = "";
-        if ( spe->isSetCompartment() ) {
-            compt = spe->getCompartment();
-        }
-        if (compt.length()< 1) {
-            //cout << "compt is empty for species "<< sindex << endl;
-            continue;
-        }
-        string id = spe->getId();
-        if (id.length() < 1) {
-            continue;
-        }
-        std::string name = "";
-        if ( spe->isSetName() ) {
-            name = spe->getName();
-            name = nameString(name);
-        }
-        if (name.empty())
-            name = id;
-        string speciesNotes = "";
-        if (spe->isSetNotes())
-        {
-            XMLNode* xnode = spe->getNotes();
-            string testnotes = spe->getNotesString();
-            XMLNode nodec = xnode->getChild(0);
-            XMLNode tnodec = nodec.getChild(0);
-            speciesNotes = tnodec.getCharacters();
-        }
-
-        Id comptEl = comptSidMIdMap[compt];
-        Id meshEntry = Neutral::child( comptEl.eref(), "mesh" );
-        string comptPath = Field<string> :: get(comptEl,"path");
-
-        // Get groupName if exist in annotation (in case of Genesis)
-        XMLNode * annotationSpe = spe->getAnnotation();
-        pair<string,pair<string, string> > group = getAnnotation_Spe_Reac(annotationSpe);
-        string groupName = group.first;
-        string xCord = group.second.first;
-        string yCord = group.second.second;
-        string groupString = comptPath+'/'+groupName;
-
-        Id groupId;
-        if (!groupName.empty())
-        {   groupId = Id( comptPath + "/"+groupName );
-            if ( groupId == Id() ) 
-                groupId = shell->doCreate( "Neutral", comptEl, groupName, 1 );
-            assert( groupId != Id() );
-            
-        }
-        bool constant = spe->getConstant();
-        bool boundaryCondition = spe->getBoundaryCondition();
-        // if (boundaryCondition == true)
-        //     cout << name << " species having BoundaryCondition true " <<endl;
-        Id pool;
-        //If constant or boundaryCondition is true then it is equivalent to a BufPool in moose
-        if (boundaryCondition == true) {
-            //if( (boundaryCondition == true) && (constant==false))
-            if (groupId == Id())
-                pool = shell->doCreate("BufPool",comptEl,name,1);
-            else
-                pool = shell->doCreate("BufPool",groupId,name,1);
-        }
-        else {
-            if (groupId == Id())
-                pool = shell->doCreate("Pool", comptEl, name ,1);
-            else
-                pool = shell->doCreate("Pool", groupId, name ,1);
-        }
-        molSidcmptMIdMap[id] = comptEl;
-        if(pool != Id())
-        {   
-            //Map to Molecule SBML id to Moose Id
-            molSidMIdMap_[id] = pool;
-
-            //shell->doAddMsg( "OneToOne",pool, "mesh", meshEntry, "mesh" );
-            bool bcondition = spe->getBoundaryCondition();
-            if ( constant == true && bcondition == false)
-                cout <<"The species "<< name << " should not appear as a reactant or product, as per SBML rules"<< endl;
-
-            unsigned int spatialDimen =Field< unsigned int >::get( comptEl, "numDimensions");
-
-            UnitDefinition * ud = spe->getDerivedUnitDefinition();
-            assert(ud != NULL);
-            double initvalue = 0.0;
-            bool hasonlySubUnit = spe->getHasOnlySubstanceUnits();
-            transvalue = transformUnits(1,ud,"substance",hasonlySubUnit);
-            
-            if ( spe->isSetInitialConcentration() ) {
-                initvalue = spe->getInitialConcentration();
-                //transvalue takes care of multiplying in any units that are defined
-                //multiplying by 1e3 converts the concentration from mole/litre (Molar) to milliMolar
-                initvalue = initvalue * transvalue * 1e3;
-                Field <double> :: set(pool, "concInit",initvalue);
-            }
-            else if ( spe->isSetInitialAmount() ) {
-                initvalue = spe->getInitialAmount();   
-                //If the amount is set then moose can populate nInit directly.
-                //hasOnlySubstanceUnits is not checked: once nInit is populated,
-                //moose automatically calculates concInit.
-                //transvalue takes care of multiplying in any units that are defined
-                //multiplying by NA (Avogadro's number) converts moles to a particle count
-                initvalue = initvalue * transvalue * NA;
-                Field < double> :: set( pool, "nInit", initvalue);
-            }
-            else {
-                unsigned int nr = model_->getNumRules();
-                bool found = false;
-                for ( unsigned int r = 0; r < nr; r++ ) {
-                    Rule * rule = model_->getRule(r);
-                    bool assignRule = rule->isAssignment();
-                    if ( assignRule ) {
-                        string rule_variable = rule->getVariable();
-                        if (rule_variable.compare(id) == 0) {
-                            found = true;
-                            break;
-                        }
-                    }
-                }
-                if (found == false) {
-                    cout << "Invalid SBML: either initialConcentration or initialAmount must be set, or the species must appear in an assignmentRule, but none of these holds for " << spe->getName() <<endl;
-                    return molSidcmptMIdMap;
-                }
-            }
-        
-            if (!xCord.empty() and !yCord.empty()) {
-                Id poolInfo;
-                string poolPath = Field<string> :: get(pool,"path");
-                poolInfo = Id( poolPath + "/info");
-                if ( poolInfo == Id() )
-                    poolInfo = shell->doCreate( "Annotator", pool, "info", 1 );
-                assert( poolInfo != Id() );
-                double x = atof( xCord.c_str() );
-                double y = atof( yCord.c_str() );
-                Field< double >::set( poolInfo, "x", x );
-                Field< double >::set( poolInfo, "y", y );
-            }
-            if (!speciesNotes.empty()) {
-                Id poolInfo;
-                string poolPath = Field<string> :: get(pool,"path");
-                poolInfo = Id( poolPath + "/info");
-                if ( poolInfo == Id() )
-                    poolInfo = shell->doCreate( "Annotator", pool, "info", 1 );
-                assert( poolInfo != Id() );
-                speciesNotes.erase(std::remove(speciesNotes.begin(), speciesNotes.end(), '\n'), speciesNotes.end());
-                speciesNotes.erase(std::remove(speciesNotes.begin(), speciesNotes.end(), '\t'), speciesNotes.end());
-                Field< string >::set( poolInfo, "notes", speciesNotes );
-            }
-        }//Pool_ != Id()
-    }
-    return molSidcmptMIdMap;
-}
-
-/* Assignment Rule */
-
-void moose::SbmlReader::getRules() {
-    unsigned int nr = model_->getNumRules();
-    //if (nr > 0)
-    //  cout << "\n ##### Need to populate funcpool and sumtotal which is pending due to equations \n";
-    Shell* shell = reinterpret_cast< Shell* >( Id().eref().data() );
-    for ( unsigned int r = 0; r < nr; r++ ) {
-        Rule * rule = model_->getRule(r);
-        bool assignRule = rule->isAssignment();
-        if ( assignRule ) {
-            string rule_variable = rule->getVariable();
-            map< string,Id >::iterator v_iter;
-            map< string,Id >::iterator m_iter;
-            v_iter = molSidMIdMap_.find( rule_variable );
-            if (v_iter != molSidMIdMap_.end()) {
-                Id rVariable = molSidMIdMap_.find(rule_variable)->second;
-                //string rstring =molSidMIdMap_.find(rule_variable)->first;
-                //Id sumId = shell->doCreate( "SumFunc", rVariable, "func", 1 );
-                Id sumId = shell->doCreate( "Function", rVariable, "func", 1 );
-                //rVariable.element()->zombieSwap( FuncPool::initCinfo() );
-                //ObjId ret = shell->doAddMsg( "single",
-                //                             ObjId( sumId, 0 ), "output",
-                //                             ObjId( rVariable, 0 ), "input" );
-                ObjId ret = shell->doAddMsg( "single",
-                                             ObjId( sumId, 0 ), "valueOut",
-                                             ObjId( rVariable, 0 ), "setN" );
-                assert( ret != ObjId() );
-                const ASTNode * ast = rule->getMath();
-                vector< string > ruleMembers;
-                ruleMembers.clear();
-                printMembers( ast,ruleMembers );
-                string rulePar = "";
-                string comma = "";
-                for ( unsigned int rm = 0; rm < ruleMembers.size(); rm++ ) {
-                    m_iter = molSidMIdMap_.find( ruleMembers[rm] );
-                    if ( m_iter != molSidMIdMap_.end() ) {
-                        Id rMember = molSidMIdMap_.find(ruleMembers[rm])->second;
-                        string rMember_str = molSidMIdMap_.find(ruleMembers[rm])->first;
-                        unsigned int numVars = Field< unsigned int >::get( sumId, "numVars" );
-                        ObjId xi( sumId.value() + 1, 0, numVars );
-                        Field< unsigned int >::set( sumId, "numVars", numVars + 1 );
-                        // ObjId ret = shell_->doAddMsg( "single", ObjId( srcId, 0 ), "nOut", xi, "input" ); 
-                        ObjId ret = shell->doAddMsg( "single",
-                                                     ObjId( rMember, 0 ), "nOut",
-                                                     xi, "input" );
-
-                        // ObjId ret = shell->doAddMsg( "single",
-                        //                              ObjId( rMember, 0 ), "nOut",
-                        //                              ObjId( sumId, 0 ), "input" );
-                        string test = molSidMIdMap_.find(ruleMembers[rm])->first;
-                        stringstream ss;
-                        for ( unsigned int i = 0; i < numVars; ++i ) {
-                            ss << "x" << i << "+";
-                        }
-                        ss << "x" << numVars;
-                        Field< string >::set( sumId, "expr", ss.str() );
-                    }
-                    else {
-                        
-                        rulePar +=  comma;
-                        rulePar += ruleMembers[rm];
-                        comma = ',';
-                        // In an assignment rule there may be constants instead of molecules, which moose does not yet handle.
-                        errorFlag_ = true;
-                    }
-                }
-                if (!rulePar.empty())
-                {   string t = "moose::SbmlReader::getRules: Assignment rule \"";
-                    t += rule_variable;
-                    t += "\" has member \"";
-                    t += rulePar;
-                    t += "\" which is not a species and is not handled in moose";
-                    cerr << t << endl;
-                }
-
-            }
-        }
-        bool rateRule = rule->isRate();
-        if ( rateRule ) {
-            string rule_variable1 = rule->getVariable();
-            cout << "Warning: for now the rate rule for \"" << rule_variable1 << "\" is not handled in moose" << endl;
-            errorFlag_ = true;
-        }
-        bool  algebRule = rule->isAlgebraic();
-        if ( algebRule ) {
-            string rule_variable1 = rule->getVariable();
-            cout << "Warning: for now the algebraic rule for \"" << rule_variable1 << "\" is not handled in moose" << endl;
-            errorFlag_ = true;
-        }
-    }
-}
-
-//REACTION
-
-void moose::SbmlReader::createReaction(const map< string, Id > &molSidcmptMIdMap ) {
-    Reaction* reac;
-
-    map< string,double > rctMap;
-    map< string,double >::iterator rctMap_iter;
-    map< string,double >prdMap;
-    map< string,double >::iterator prdMap_iter;
-    map< string,EnzymeInfo >enzInfoMap;
-
-    for ( unsigned int r = 0; r < model_->getNumReactions(); r++ ) {
-        Id reaction_;
-        reac = model_->getReaction( r );
-        noOfsub_ = 0;
-        noOfprd_ = 0;
-        std:: string id; //=reac->getId();
-        if ( reac->isSetId() )
-            id = reac->getId();
-
-        std::string name;
-        if ( reac->isSetName() ) {
-            name = reac->getName();
-            name = nameString(name);
-        }
-        if (name.empty()) {
-            if (id.empty())
-                assert(!"Reaction id and name are both empty");
-            else
-                name = id;
-        }
-        string grpname = getAnnotation( reac,enzInfoMap );
-        string reactionNotes = "";
-        if (reac->isSetNotes())
-        {
-            XMLNode* xnode = reac->getNotes();
-            string testnotes = reac->getNotesString();
-            XMLNode nodec = xnode->getChild(0);
-            XMLNode tnodec = nodec.getChild(0);
-            reactionNotes = tnodec.getCharacters();
-        }
-
-        if ( (grpname != "") && (enzInfoMap[grpname].stage == 3) )
-            setupEnzymaticReaction( enzInfoMap[grpname],grpname ,molSidcmptMIdMap,name, reactionNotes);
-        
-        // if (grpname != "")
-        // {
-        // cout << "\n enz matic reaction " << enzInfoMap[grpname].stage;
-        // setupEnzymaticReaction( enzInfoMap[grpname],grpname ,molSidcmptMIdMap,name);
-        // }
-
-        else if ( grpname == "" ) {
-            if (reac->getNumModifiers() > 0)
-                setupMMEnzymeReaction( reac,id,name ,molSidcmptMIdMap);
-            else {
-                bool rev=reac->getReversible();
-                bool fast=reac->getFast();
-                if ( fast ) {
-                    cout<<"Warning: for now the fast attribute is not handled"<<endl;
-                    errorFlag_ = true;
-                }
-                // Get groupName if exist in annotation (in case of Genesis)
-                XMLNode * annotationRea = reac->getAnnotation();
-                //string groupName = getAnnotation_Spe_Reac(annotationRea);
-                pair<string, pair<string,string> > group = getAnnotation_Spe_Reac(annotationRea);
-                string groupName = group.first;
-                string xCord = group.second.first;
-                string yCord = group.second.second;
-                
-                int numRcts = reac->getNumReactants();
-                int numPdts = reac->getNumProducts();
-                if (numRcts != 0 )
-                {  // In moose, a reaction's compartment is taken from the first substrate's compartment.
-                   // If the substrate is missing, the compartment would come from the reaction itself.
-                   // Ideally such a reaction should not be created; for now a warning is printed that the substrate is missing.
-                    const SpeciesReference* rect=reac->getReactant(0);
-                    std::string sp=rect->getSpecies();
-                    Id comptRef = molSidcmptMIdMap.find(sp)->second; //gives compartment of sp
-                    Id meshEntry = Neutral::child( comptRef.eref(), "mesh" );
-                    Shell* shell = reinterpret_cast< Shell* >( Id().eref().data() );
-                    string comptPath = Field<string> :: get(comptRef,"path");
-                    string groupString = comptPath+'/'+groupName;
-                    Id groupId;
-                    if (!groupName.empty())
-                    {   groupId = Id( comptPath + "/"+groupName );
-                        if ( groupId == Id() ) 
-                            groupId = shell->doCreate( "Neutral", comptRef, groupName, 1 );
-                        assert( groupId != Id() );
-                    }
-                    if (groupId == Id())
-                        reaction_ = shell->doCreate("Reac", comptRef, name, 1);
-                    else
-                        reaction_ = shell->doCreate("Reac", groupId, name, 1);
-
-                    //shell->doAddMsg( "Single", meshEntry, "remeshReacs", reaction_, "remesh");
-                    //Get Substrate
-                    addSubPrd(reac,reaction_,"sub");
-                    //Get Product
-                    if (numPdts != 0)
-                        addSubPrd(reac,reaction_,"prd");
-
-                    if (!xCord.empty() and !yCord.empty())
-                    {  Id reacInfo;
-                        string reacPath = Field<string> :: get(reaction_,"path");
-                        reacInfo = Id( reacPath + "/info");
-                        if ( reacInfo == Id() ) 
-                            reacInfo = shell->doCreate( "Annotator", reaction_, "info", 1 );
-                        assert( reacInfo != Id() );
-                        double x = atof( xCord.c_str() );
-                        double y = atof( yCord.c_str() );
-                        Field< double >::set( reacInfo, "x", x );
-                        Field< double >::set( reacInfo, "y", y );
-                    }
-                    if (!reactionNotes.empty())
-                    {   Id reacInfo;
-                        string reacPath = Field<string> :: get(reaction_,"path");
-                        reacInfo = Id( reacPath + "/info");
-                        if ( reacInfo == Id() ) 
-                            reacInfo = shell->doCreate( "Annotator", reaction_, "info", 1 );
-                        assert( reacInfo != Id() );
-                        reactionNotes.erase(std::remove(reactionNotes.begin(), reactionNotes.end(), '\n'), reactionNotes.end());
-                        reactionNotes.erase(std::remove(reactionNotes.begin(), reactionNotes.end(), '\t'), reactionNotes.end());
-                        Field< string >::set( reacInfo, "notes", reactionNotes );
-                    }
-                    if ( reac->isSetKineticLaw() ) {
-                        KineticLaw * klaw=reac->getKineticLaw();
-                        //vector< double > rate = getKLaw( klaw,rev );
-                        vector< double > rate;
-                        rate.clear();
-                        string amt_Conc;
-                        getKLaw( klaw,rev,rate,amt_Conc );
-                        if ( errorFlag_ )
-                            return;
-                        else if ( !errorFlag_ ) {
-                            if (amt_Conc == "amount")
-                            {   Field < double > :: set( reaction_, "numKf", rate[0] );
-                                Field < double > :: set( reaction_, "numKb", rate[1] );
-                            }
-                            else if (amt_Conc == "concentration")
-                            {   Field < double > :: set( reaction_, "Kf", rate[0] );
-                                Field < double > :: set( reaction_, "Kb", rate[1] );
-                            }
-                            /*if (numRcts > 1)
-                            rate[0] = rate[0]*pow(1e3,1.0);
-                                 Field < double > :: set( reaction_, "Kf", rate[0] );
-                                 Field < double > :: set( reaction_, "Kb", rate[1] );
-                                 */
-                        }
-                    } //issetKineticLaw
-                }
-                else
-                    cout << "Warning: reaction \"" << name << "\" does not have a substrate; it is not read into moose"<<endl;
-            } //else
-        } // else grpname == ""
-    }//for unsigned
-} //reaction
-
-/* Enzymatic Reaction  */
-void moose::SbmlReader::setupEnzymaticReaction( const EnzymeInfo & einfo,string enzname, const map< string, Id > &molSidcmptMIdMap,string name1,string enzNotes) {
-    string enzPool = einfo.enzyme;
-    Id comptRef = molSidcmptMIdMap.find(enzPool)->second; //gives compartment of sp
-    Id meshEntry = Neutral::child( comptRef.eref(), "mesh" );
-    Shell* shell = reinterpret_cast< Shell* >( Id().eref().data() );
-    string xCord = einfo.xcord;
-    string yCord = einfo.ycord;
-    //Creating enz pool to enzyme site
-    Id enzPoolId = molSidMIdMap_.find(enzPool)->second;
-
-    string enzparentpath = Field<string> :: get(enzPoolId,"path");
-    Id enzId = Id( enzparentpath + "/"+name1 );
-    
-    Id enzyme_ = shell->doCreate("Enz", enzPoolId, name1, 1);
-    //shell->doAddMsg( "Single", meshEntry, "remeshReacs", enzyme_, "remesh");
-
-    if (enzyme_ != Id())
-    {
-        Id complex = einfo.complex;
-        string clxpath = Field<string> :: get(complex,"path");
-        //Moving enzyme site under enzyme
-        shell->doMove(complex,enzyme_);
-        shell->doAddMsg("OneToAll",enzyme_,"cplx",complex,"reac");
-
-        shell->doAddMsg("OneToOne",enzyme_,"enz",enzPoolId,"reac");
-
-        vector< Id >::const_iterator sub_itr;
-        for ( sub_itr = einfo.substrates.begin(); sub_itr != einfo.substrates.end(); sub_itr++ ) {
-            Id S = (*sub_itr);
-            Id b = shell->doAddMsg( "OneToOne", enzyme_, "sub" ,S , "reac" );
-        }
-
-        vector< Id >::const_iterator prd_itr;
-        for ( prd_itr = einfo.products.begin(); prd_itr != einfo.products.end(); prd_itr++ ) {
-            Id P = (*prd_itr);
-            shell->doAddMsg ("OneToOne",enzyme_,"prd", P,"reac");
-        }
-        // populate k3,k2,k1 in this order only.
-        Field < double > :: set( enzyme_, "k3", einfo.k3 );
-        Field < double > :: set( enzyme_, "k2", einfo.k2 );
-        Field < double > :: set( enzyme_, "k1", einfo.k1 );
-        if (!xCord.empty() and !yCord.empty())
-        {  Id enzInfo;
-            string enzPath = Field<string> :: get(enzyme_,"path");
-            enzInfo = Id( enzPath + "/info");
-            if ( enzInfo == Id() ) 
-                enzInfo = shell->doCreate( "Annotator", enzyme_, "info", 1 );
-            assert( enzInfo != Id() );
-            double x = atof( xCord.c_str() );
-            double y = atof( yCord.c_str() );
-            Field< double >::set( enzInfo, "x", x );
-            Field< double >::set( enzInfo, "y", y );
-        } //xCord.empty
-        if (!enzNotes.empty())
-        {  Id enzInfo;
-            string enzPath = Field<string> :: get(enzyme_,"path");
-            enzInfo = Id( enzPath + "/info");
-            if ( enzInfo == Id() ) 
-                enzInfo = shell->doCreate( "Annotator", enzyme_, "info", 1 );
-            assert( enzInfo != Id() );
-            enzNotes.erase(std::remove(enzNotes.begin(), enzNotes.end(), '\n'), enzNotes.end());
-            enzNotes.erase(std::remove(enzNotes.begin(), enzNotes.end(), '\t'), enzNotes.end());
-            Field< string >::set( enzInfo, "notes", enzNotes );
-        } //enzNotes.empty
-    }//enzyme_
-}
-
-/*  get annotation  */
-pair<string, pair< string, string> > moose::SbmlReader :: getAnnotation_Spe_Reac(XMLNode * annotationSpe_Rec)
-{   string groupName = "";
-    string xcord = "";
-    string ycord = "";
-    //XMLNode * annotationSpe_Rec = spe_rec->getAnnotation();
-    if( annotationSpe_Rec != NULL ) {
-        unsigned int num_children = annotationSpe_Rec->getNumChildren();
-        for( unsigned int child_no = 0; child_no < num_children; child_no++ ) {
-            XMLNode childNode = annotationSpe_Rec->getChild( child_no );
-            if ( childNode.getPrefix() == "moose" && childNode.getName() == "ModelAnnotation" ) {
-                unsigned int num_gchildren = childNode.getNumChildren();
-                for( unsigned int gchild_no = 0; gchild_no < num_gchildren; gchild_no++ ) {
-                    XMLNode &grandChildNode = childNode.getChild( gchild_no );
-                    string nodeName = grandChildNode.getName();
-                    if (nodeName == "Group")
-                    {   groupName = (grandChildNode.getChild(0).toXMLString()).c_str();
-                        //group = shell->doCreate( "Neutral", mgr, "groups", 1, MooseGlobal );
-                        // assert( group != Id() );
-                    }
-                    else if (nodeName == "xCord")
-                        xcord = (grandChildNode.getChild(0).toXMLString()).c_str();
-                    else if (nodeName == "yCord")
-                        ycord = (grandChildNode.getChild(0).toXMLString()).c_str();
-                    
-                } //gchild
-            } //moose and modelAnnotation
-        } //child
-    }//annotation Node
-    return make_pair(groupName, make_pair(xcord,ycord));
-}
-string moose::SbmlReader::getAnnotation( Reaction* reaction,map<string,EnzymeInfo> &enzInfoMap ) {
-    XMLNode * annotationNode = reaction->getAnnotation();
-    EnzymeInfo einfo;
-    string grpname = "",stage,group;
-    string xcord,ycord;
-
-    if( annotationNode != NULL ) {
-        unsigned int num_children = annotationNode->getNumChildren();
-        for( unsigned int child_no = 0; child_no < num_children; child_no++ ) {
-            XMLNode childNode = annotationNode->getChild( child_no );
-            if ( childNode.getPrefix() == "moose" && childNode.getName() == "EnzymaticReaction" ) {
-                unsigned int num_gchildren = childNode.getNumChildren();
-                for( unsigned int gchild_no = 0; gchild_no < num_gchildren; gchild_no++ ) {
-                    XMLNode &grandChildNode = childNode.getChild( gchild_no );
-                    string nodeName = grandChildNode.getName();
-                    string nodeValue;
-                    if (grandChildNode.getNumChildren() == 1 ) {
-                        nodeValue = grandChildNode.getChild(0).toXMLString();
-
-                    } else {
-                        cout << "Error: expected exactly ONE child of " << nodeName << endl;
-                    }
-                    if ( nodeName == "enzyme" )
-                        einfo.enzyme = molSidMIdMap_.find(nodeValue)->first;
-
-                    else if ( nodeName == "complex" )
-                        einfo.complex=molSidMIdMap_.find(nodeValue)->second;
-
-                    else if ( nodeName == "substrates") {
-                        Id elem = molSidMIdMap_.find(nodeValue)->second;
-                        einfo.substrates.push_back(elem);
-                    } 
-                    else if ( nodeName == "product" ) {
-                        Id elem = molSidMIdMap_.find(nodeValue)->second;
-                        einfo.products.push_back(elem);
-                    }
-                    else if ( nodeName == "groupName" )
-                        grpname = nodeValue;
-                    
-                    else if ( nodeName == "stage" )
-                        stage = nodeValue;
-
-                    else if ( nodeName == "Group" )
-                        einfo.group = nodeValue;
-                    
-                    else if ( nodeName == "xCord" )
-                        einfo.xcord = nodeValue;
-                    
-                    else if ( nodeName == "yCord" )
-                        einfo.ycord = nodeValue;
-                }
-                if ( stage == "1" ) {
-                    enzInfoMap[grpname].substrates = einfo.substrates;
-                    enzInfoMap[grpname].enzyme = einfo.enzyme;
-                    einfo.stage = 1;
-                    enzInfoMap[grpname].stage = einfo.stage;
-                    enzInfoMap[grpname].group = einfo.group;
-                    enzInfoMap[grpname].xcord = einfo.xcord;
-                    enzInfoMap[grpname].ycord = einfo.ycord;
-                    KineticLaw * klaw=reaction->getKineticLaw();
-                    vector< double > rate ;
-                    rate.clear();
-                    string amt_Conc;
-                    getKLaw( klaw,true,rate,amt_Conc );
-                    if ( errorFlag_ )
-                        exit(0);
-                    else if ( !errorFlag_ ) {
-                        enzInfoMap[grpname].k1 = rate[0];
-                        enzInfoMap[grpname].k2 = rate[1];
-                    }
-                }
-                //Stage =='2' means ES* -> E+P;
-                else if ( stage == "2" ) {
-                    enzInfoMap[grpname].complex = einfo.complex;
-                    enzInfoMap[grpname].products = einfo.products;
-                    einfo.stage = 2;
-                    enzInfoMap[grpname].stage += einfo.stage;
-                    KineticLaw * klaw=reaction->getKineticLaw();
-                    vector< double > rate;
-                    rate.clear();
-                    string amt_Conc;
-                    getKLaw( klaw,false,rate,amt_Conc);
-                    if ( errorFlag_ )
-                        exit(0);
-                    else if ( !errorFlag_ )
-                        enzInfoMap[grpname].k3 = rate[0];
-                }
-            }
-        }
-    }
-    return grpname;
-}
-
-/*    set up Michaelis-Menten reaction  */
-void moose::SbmlReader::setupMMEnzymeReaction( Reaction * reac,string rid,string rname,const map< string, Id > &molSidcmptMIdMap ) {
-    string::size_type loc = rid.find( "_MM_Reaction_" );
-    if( loc != string::npos ) {
-        int strlen = rid.length();
-        rid.erase( loc,strlen-loc );
-    }
-    unsigned int num_modifr = reac->getNumModifiers();
-    // Get groupName if it exists in the annotation (in case of Genesis models)
-    XMLNode * annotationRea = reac->getAnnotation();
-    //string groupName = getAnnotation_Spe_Reac(annotationRea);
-    pair<string, pair<string,string> > group = getAnnotation_Spe_Reac(annotationRea);
-    string groupName = group.first;
-    string xCord = group.second.first;
-    string yCord = group.second.second;
-    string MMEnznotes = "";
-    if (reac->isSetNotes())
-        {
-            XMLNode* xnode = reac->getNotes();
-            string testnotes = reac->getNotesString();
-            XMLNode nodec = xnode->getChild(0);
-            XMLNode tnodec = nodec.getChild(0);
-            MMEnznotes = tnodec.getCharacters();
-        }
-
-    for ( unsigned int m = 0; m < num_modifr; m++ ) {
-        const ModifierSpeciesReference* modifr=reac->getModifier( m );
-        std::string sp = modifr->getSpecies();
-        Id enzyme_;
-        Id E = molSidMIdMap_.find(sp)->second;
-        string Epath = Field<string> :: get(E,"path");
-        //cout << " \n \n  epath" << Epath;
-        Id comptRef = molSidcmptMIdMap.find(sp)->second; //gives compartment of sp
-        Id meshEntry = Neutral::child( comptRef.eref(), "mesh" );
-        Shell* shell = reinterpret_cast< Shell* >( Id().eref().data() );
-        enzyme_ = shell->doCreate("MMenz",E,rname,1);
-        if (enzyme_ != Id())
-        {   //shell->doAddMsg( "Single", meshEntry, "remeshReacs", enzyme_, "remesh");
-            if (E != Id())
-                shell->doAddMsg("Single",E,"nOut",enzyme_,"enzDest");
-            if (!xCord.empty() and !yCord.empty())
-            {  Id enzInfo;
-                string enzPath = Field<string> :: get(enzyme_,"path");
-                enzInfo = Id( enzPath + "/info");
-                if ( enzInfo == Id() ) 
-                    enzInfo = shell->doCreate( "Annotator", enzyme_, "info", 1 );
-                assert( enzInfo != Id() );
-                double x = atof( xCord.c_str() );
-                double y = atof( yCord.c_str() );
-                Field< double >::set( enzInfo, "x", x );
-                Field< double >::set( enzInfo, "y", y );
-            }
-            if(!MMEnznotes.empty())
-            {
-                Id enzInfo;
-                string enzPath = Field<string> :: get(enzyme_,"path");
-                enzInfo = Id( enzPath + "/info");
-                if ( enzInfo == Id() ) 
-                    enzInfo = shell->doCreate( "Annotator", enzyme_, "info", 1 );
-                assert( enzInfo != Id() );
-                MMEnznotes.erase(std::remove(MMEnznotes.begin(), MMEnznotes.end(), '\n'), MMEnznotes.end());
-                MMEnznotes.erase(std::remove(MMEnznotes.begin(), MMEnznotes.end(), '\t'), MMEnznotes.end());
-                Field< string >::set( enzInfo, "notes", MMEnznotes );
-            }
-
-            KineticLaw * klaw=reac->getKineticLaw();
-            vector< double > rate;
-            rate.clear();
-            string amt_Conc;
-            getKLaw( klaw,true,rate,amt_Conc);
-            if ( errorFlag_ )
-                return;
-            else if ( !errorFlag_ ) {
-                for ( unsigned int rt = 0; rt < reac->getNumReactants(); rt++ ) {
-                    const SpeciesReference* rct = reac->getReactant( rt );
-                    sp=rct->getSpecies();
-                    Id S = molSidMIdMap_.find(sp)->second;
-                    if (S != Id())
-                        shell->doAddMsg( "OneToOne", enzyme_, "sub" ,S , "reac" );
-                }
-                for ( unsigned int pt = 0; pt < reac->getNumProducts(); pt++ ) {
-                    const SpeciesReference* pdt = reac->getProduct(pt);
-                    sp = pdt->getSpecies();
-                    Id P = molSidMIdMap_.find(sp)->second;
-                    if (P != Id())
-                        shell->doAddMsg( "OneToOne", enzyme_, "prd" ,P, "reac" );
-                }
-                Field < double > :: set( enzyme_, "kcat", rate[0] );
-                Field < double > :: set( enzyme_, "numKm", rate[1] );
-            }
-     } //if Enzyme_
-    }
-}
-
-/*    get Parameters from Kinetic Law  */
-void moose::SbmlReader::getParameters( const ASTNode* node,vector <string> & parameters ) {
-    assert( parameters.empty() );
-    //cout << " parameter type " <<node->getType();
-
-    if ( node->getType() == AST_MINUS ) {
-        const ASTNode* lchild = node->getLeftChild();
-        pushParmstoVector( lchild,parameters );
-
-        if ( parameters.size() == 1 ) {
-            const ASTNode* rchild = node->getRightChild();
-            pushParmstoVector( rchild,parameters );
-        }
-    } else if ( node->getType() == AST_DIVIDE ) {
-        const ASTNode* lchild = node->getLeftChild();
-        pushParmstoVector( lchild,parameters );
-        if (( parameters.size() == 1 ) || ( parameters.size() == 0 )) {
-            const ASTNode* rchild = node->getRightChild();
-            pushParmstoVector( rchild,parameters );
-        }
-    } else if ( node->getType() == AST_TIMES ) {
-        //cout << " time " <<endl;
-        pushParmstoVector( node,parameters );
-    } else if ( node->getType() == AST_PLUS )
-        pushParmstoVector( node,parameters );
-    else if ( node->getType() == AST_NAME )
-        pushParmstoVector( node,parameters );
-    if ( parameters.size() > 2 ) {
-        cout << "Sorry! For now MOOSE cannot handle more than 2 parameters." << endl;
-        errorFlag_ = true;
-    }
-
-}
-
-/*   push the Parameters used in Kinetic law to a vector  */
-
-void moose::SbmlReader::pushParmstoVector(const ASTNode* p,vector <string> & parameters) {
-    string parm = "";
-    //cout << "\n there " << p->getType();
-    //cout << "_NAME" << " = " <<p->getName();
-    if ( p->getType() == AST_NAME ) {
-        pvm_iter = parmValueMap.find( std::string(p->getName()) );
-        if ( pvm_iter != parmValueMap.end() ) {
-            parm = pvm_iter->first;
-            parameters.push_back( parm );
-        }
-    }
-    int num = p->getNumChildren();
-    for( int i = 0; i < num; ++i ) {
-        const ASTNode* child = p->getChild(i);
-        pushParmstoVector( child,parameters );
-    }
-}
-
-/*     get Kinetic Law  */
-void moose::SbmlReader::getKLaw( KineticLaw * klaw,bool rev,vector< double > & rate,string &amt_Conc) {
-    std::string id;
-    amt_Conc = "amount";
-    double value = 0.0;
-    UnitDefinition * kfud;
-    UnitDefinition * kbud;
-    int np = klaw->getNumParameters();
-    bool flag = true;
-    for ( int pi = 0; pi < np; pi++ ) {
-        Parameter * p = klaw->getParameter(pi);
-
-        if ( p->isSetId() )
-            id = p->getId();
-        if ( p->isSetValue() )
-            value = p->getValue();
-        parmValueMap[id] = value;
-        flag = false;
-    }
-    double kf = 0.0,kb = 0.0,kfvalue,kbvalue;
-    string kfparm,kbparm;
-    vector< string > parameters;
-    parameters.clear();
-    const ASTNode* astnode=klaw->getMath();
-    //cout << "\nkinetic law is :" << SBML_formulaToString(astnode) << endl;
-    getParameters( astnode,parameters );
-    //cout << "getKLaw " << errorFlag_;
-    if ( errorFlag_ )
-        return;
-    else if ( !errorFlag_ ) {
-        if ( parameters.size() == 1 ) {
-            kfparm = parameters[0];
-            kbparm = parameters[0];
-        } else if ( parameters.size() == 2 ) {
-            kfparm = parameters[0];
-            kbparm = parameters[1];
-        }
-        //cout << "\n parameter "<< parameters.size();
-        //cout << "$$ "<< parmValueMap[kfparm];
-        //cout << " \t \t " << parmValueMap[kbparm];
-
-        kfvalue = parmValueMap[kfparm];
-        kbvalue = parmValueMap[kbparm];
-        Parameter* kfp;
-        Parameter* kbp;
-        if ( flag ) {
-            kfp = model_->getParameter( kfparm );
-            kbp = model_->getParameter( kbparm );
-        } else {
-            kfp = klaw->getParameter( kfparm );
-            kbp = klaw->getParameter( kbparm );
-        }
-        //cout << "\t \n \n" << kfp << " " <<kbp;
-
-        if ( kfp->isSetUnits() ) {
-            kfud = kfp->getDerivedUnitDefinition();
-            //cout << "parameter unit :" << UnitDefinition::printUnits(kfp->getDerivedUnitDefinition())<< endl;
-            //cout << " rate law ";
-            double transkf = transformUnits( 1,kfud ,"substance",true);
-            //cout << " transkf " << transkf<<endl;
-            kf = kfvalue * transkf;
-            kb = 0.0;
-        } 
-        else {
-            double lvalue =1.0;
-            /* If rate units are not defined, try to get the substance unit */
-            if (model_->getNumUnitDefinitions() > 0)
-                lvalue = unitsforRates();
-                //cout << "Substrate units are specified " << lvalue <<endl;
-            /* If neither the rate units nor the substance unit is defined, assume the
-               substance unit is mole and convert mole to millimole.
-            */
-            amt_Conc = "concentration";
-            //cout << " rate law ";
-            if (noOfsub_ >1)
-                lvalue /= pow(1e+3,(noOfsub_-1));
-            kf = kfvalue*lvalue;
-            
-        }// !kfp is notset
-        if ( ( kbp->isSetUnits() ) && ( rev ) ) {
-            kbud = kbp->getDerivedUnitDefinition();
-            double transkb = transformUnits( 1,kbud,"substance",true );
-            kb = kbvalue * transkb;
-        }
-        if ( (! kbp->isSetUnits() ) && ( rev ) ) {
-            double lvalue =1.0;
-            /* If rate units are not defined, try to get the substance unit */
-            if (model_->getNumUnitDefinitions() > 0)
-                lvalue = unitsforRates();
-            /* If neither the rate units nor the substance unit is defined, assume the
-               substance unit is mole (concentration, hasOnlySubstanceUnits = false)
-               and convert mole to millimole.
-            */
-            if (noOfprd_ >1)
-                lvalue /= pow(1e+3,(noOfprd_-1));
-            kb = kbvalue*lvalue;
-
-        }
-        rate.push_back( kf );
-        rate.push_back( kb );
-    }
-}
-double moose::SbmlReader::unitsforRates() {
-    double lvalue =1;
-    for (unsigned int n=0; n < model_->getNumUnitDefinitions(); n++) {
-        UnitDefinition * ud = model_->getUnitDefinition(n);
-        for (unsigned int ut=0; ut <ud->getNumUnits(); ut++) {
-            Unit * unit = ud->getUnit(ut);
-            if (ud->getId() == "substance") {
-                if ( unit->isMole() ) {
-                    double exponent = unit->getExponent();
-                    double multiplier = unit->getMultiplier();
-                    int scale = unit->getScale();
-                    double offset = unit->getOffset();
-                    lvalue *= pow( multiplier * pow(10.0,scale), exponent ) + offset;
-                    return lvalue;
-                }
-            }
-        }
-    }
-    return lvalue;
-}//unitforRates
-void moose::SbmlReader::addSubPrd(Reaction * reac,Id reaction_,string type) {
-    if (reaction_ != Id())
-    {
-        map< string,double > rctMap;
-        map< string,double >::iterator rctMap_iter;
-        double rctcount=0.0;
-        Shell * shell = reinterpret_cast< Shell* >( Id().eref().data() );
-        rctMap.clear();
-        unsigned int nosubprd;
-        const SpeciesReference* rct;
-        if (type == "sub") {
-            nosubprd = reac->getNumReactants();
-        } else
-            nosubprd = reac->getNumProducts();
-        for ( unsigned int rt=0; rt<nosubprd; rt++ ) {
-            if (type == "sub")
-                rct = reac->getReactant(rt);
-            else
-                rct = reac->getProduct(rt);
-            std:: string sp = rct->getSpecies();
-            rctMap_iter = rctMap.find(sp);
-            if ( rctMap_iter != rctMap.end() )
-                rctcount = rctMap_iter->second;
-            else
-                rctcount = 0.0;
-            rctcount += rct->getStoichiometry();
-            rctMap[sp] = rctcount;
-            if (type =="sub")
-                noOfsub_ +=rctcount;
-            for ( int i=0; (int)i<rct->getStoichiometry(); i++ )
-                shell->doAddMsg( "OneToOne", reaction_, type ,molSidMIdMap_[sp] , "reac" );
-        }
-    }
-}
-/* Transform units from SBML to MOOSE
-   MOOSE units for
-   volume -- cubic meter
-*/
-
-double moose::SbmlReader::transformUnits( double mvalue,UnitDefinition * ud,string type, bool hasonlySubUnit ) {
-    assert (ud);
-    double lvalue = mvalue;
-    if (type == "compartment") 
-    {   if(ud->getNumUnits() == 0)
-            unitsDefined = false;
-        else
-        {   for ( unsigned int ut = 0; ut < ud->getNumUnits(); ut++ ) {
-            Unit * unit = ud->getUnit(ut);
-            if ( unit->isLitre() ) {
-                double exponent = unit->getExponent();
-                double multiplier = unit->getMultiplier();
-                int scale = unit->getScale();
-                double offset = unit->getOffset();
-                lvalue *= pow( multiplier * pow(10.0,scale), exponent ) + offset;
-                // TODO: if the spatial dimension is less than 3, the 1e-3
-                // conversion from litres to cubic metres should not be applied.
-                lvalue *= pow(1e-3,exponent);
-                }
-            }
-        }
-    } 
-    else if(type == "substance")
-    {   double exponent = 1.0;
-        if(ud->getNumUnits() == 0)
-            unitsDefined = false;
-        else {
-            for ( unsigned int ut = 0; ut < ud->getNumUnits(); ut++ ) {
-                Unit * unit = ud->getUnit(ut);
-                if ( unit->isMole() ) {
-                    exponent = unit->getExponent();
-                    double multiplier = unit->getMultiplier();
-                    int scale = unit->getScale();
-                    double offset = unit->getOffset();
-                    lvalue *= pow( multiplier * pow(10.0,scale), exponent ) + offset;
-                }//if unit is Mole
-                else if (unit->isItem()){
-                    exponent = unit->getExponent();
-                    double multiplier = unit->getMultiplier();
-                    int scale = unit->getScale();
-                    double offset = unit->getOffset();
-                    lvalue *= pow( multiplier * pow(10.0,scale), exponent ) + offset;
-                }    
-            }//for
-        } //else
-    } // type="substance"
-return lvalue;
-}//transformUnits
-void moose::SbmlReader::printMembers( const ASTNode* p,vector <string> & ruleMembers ) {
-    if ( p->getType() == AST_NAME ) {
-        //cout << "_NAME" << " = " << p->getName() << endl;
-        ruleMembers.push_back( p->getName() );
-    }
-    int num = p->getNumChildren();
-    for( int i = 0; i < num; ++i ) {
-        const ASTNode* child = p->getChild(i);
-        printMembers( child,ruleMembers );
-    }
-}
-
-void moose::SbmlReader ::findModelParent( Id cwe, const string& path, Id& parentId, string& modelName ) {
-    //Taken from LoadModels.cpp
-    //If the path exists (e.g. /model when invoked from the GUI), the model is created under
-    // /model/filename, because by default genesis/sbml models are created under '/model',
-    // which already exists by the time control reaches MooseSbmlReader.cpp.
-    //When run directly (command-line readSBML()) the path is ignored, the model is created
-    // under '/', and the filename defaults to "SBMLtoMoose".
-    //modelName = "test";
-    string fullPath = path;
-    if ( path.length() == 0 )
-        parentId = cwe;
-
-    if ( path == "/" )
-        parentId = Id();
-
-    if ( path[0] != '/' ) {
-        string temp = cwe.path();
-        if ( temp[temp.length() - 1] == '/' )
-            fullPath = temp + path;
-        else
-            fullPath = temp + "/" + path;
-    }
-    Id paId( fullPath );
-    if ( paId == Id() ) { // Path includes new model name
-        string::size_type pos = fullPath.find_last_of( "/" );
-        assert( pos != string::npos );
-        string head = fullPath.substr( 0, pos );
-        Id ret( head );
-        // When head = "" it means paId should be root.
-        if ( ret == Id() && head != "" && head != "/root" )
-            ;//return 0;
-        parentId = ret;
-        modelName = fullPath.substr( pos + 1 );
-    }
-
-    else { // Path is an existing element.
-        parentId = paId;
-
-    }
-}
-
-/**
- * @brief Populate parmValueMap, a member variable that keeps all the global
- * parameters of the SBML model.
- */
-void moose::SbmlReader::getGlobalParameter() {
-    for ( unsigned int pm = 0; pm < model_->getNumParameters(); pm++ ) {
-        Parameter* prm = model_->getParameter( pm );
-        std::string id,unit;
-        if ( prm->isSetId() ) {
-            id = prm->getId();
-        }
-        double value = 0.0;
-        if ( prm->isSetValue() ) {
-            value=prm->getValue();
-        }
-        parmValueMap[id] = value;
-    }
-}
-
-string moose::SbmlReader::nameString( string str ) {
-    string str1;
-
-    int len = str.length();
-    int i= 0;
-    do {
-        switch( str.at(i) ) {
-        case ' ':
-            str1 = "_space_";
-            str.replace( i,1,str1 );
-            len += str1.length()-1;
-            break;
-        }
-        i++;
-    } while ( i < len );
-    return str;
-}
-#endif // USE_SBML
diff --git a/sbml/MooseSbmlReader.h b/sbml/MooseSbmlReader.h
deleted file mode 100644
index bf97f1ec..00000000
--- a/sbml/MooseSbmlReader.h
+++ /dev/null
@@ -1,86 +0,0 @@
-/*******************************************************************
- * File:            moose::SbmlReader.h
- * Description:
- * Author:          HarshaRani G.V
- * E-mail:          hrani@ncbs.res.in
- ********************************************************************/
-/**********************************************************************
-** This program is part of 'MOOSE', the
-** Messaging Object Oriented Simulation Environment,
-** also known as GENESIS 3 base code.
-**           copyright (C) 2003-2015 Upinder S. Bhalla. and NCBS
-** It is made available under the terms of the
-** GNU Lesser General Public License version 2.1
-** See the file COPYING.LIB for the full notice.
-**********************************************************************/
-
-#ifndef _MOOSESBMLREADER_H
-#define _MOOSESBMLREADER_H
-#ifdef USE_SBML
-
-#include <sbml/SBMLTypes.h>
-
-#include "../basecode/Id.h"
-//class Id;
-typedef struct {
-    string enzyme;
-    Id complex;
-    vector<Id> substrates;
-    vector<Id> products;
-    double k1;
-    double k2;
-    double k3;
-    int stage;
-    string group;
-    string xcord;
-    string ycord;
-} EnzymeInfo;
-
-namespace moose{
-class SbmlReader {
-public:
-    SbmlReader() {
-        errorFlag_ = false;
-    }
-    ~SbmlReader() {
-        ;
-    }
-    Id read(string filename,string location,string solverClass);
-    map< string, Id> createCompartment(string location,Id parentId,string modelName,Id base_);
-    typedef  map<string, Id> sbmlStr_mooseId;
-    //typedef  map< string
-    //                                    , tuple<string,double,bool>
-    //                                    > sbmlId_convUnit;
-    const sbmlStr_mooseId createMolecule(map<string,Id> &);
-    void  createReaction(const sbmlStr_mooseId &);
-
-private:
-    bool errorFlag_;
-    //sbmlId_convUnit poolMap_;
-    Model* model_;
-    SBMLDocument* document_;
-    /* SBMLReader reader_; */
-    map< string, Id >molSidMIdMap_;
-    int noOfsub_;
-    int noOfprd_;
-    Id baseId;
-    double transformUnits( double msize,UnitDefinition * ud,string type,bool hasonlySubUnit );
-    double unitsforRates();
-    void getRules();
-    string nameString( string str );
-    void printMembers( const ASTNode* p,vector <string> & ruleMembers );
-    void addSubPrd(Reaction * reac,Id reaction_,string type);
-    void getKLaw( KineticLaw * klaw,bool rev,vector< double > & rate ,string & amt_Conc);
-    void pushParmstoVector( const ASTNode* p,vector <string> & parameters );
-    void getParameters( const ASTNode* node,vector <string> & parameters );
-    void setupMMEnzymeReaction( Reaction * reac,string id ,string name,const map<string, Id> &);
-    pair<string, pair<string, string> > getAnnotation_Spe_Reac( XMLNode * annotationSpe_Rec );
-    string getAnnotation( Reaction* reaction,map<string,EnzymeInfo> & );
-    void setupEnzymaticReaction( const EnzymeInfo & einfo,string name,const map< string, Id > & ,string name1,string notes);
-    void findModelParent( Id cwe, const string& path,Id& parentId, string& modelName );
-    void getGlobalParameter();
-};
-} // namespace moose
-#endif //USE_SBML
-#endif // _MOOSESBMLREADER_H
-
diff --git a/sbml/MooseSbmlWriter.cpp b/sbml/MooseSbmlWriter.cpp
deleted file mode 100644
index 0fca788b..00000000
--- a/sbml/MooseSbmlWriter.cpp
+++ /dev/null
@@ -1,1137 +0,0 @@
-/*******************************************************************
- * File:            MooseSbmlWriter.cpp
- * Description:
- * Author:          HarshaRani
- * E-mail:          hrani@ncbs.res.in
- ********************************************************************/
-/**********************************************************************
-** This program is part of 'MOOSE', the
-** Messaging Object Oriented Simulation Environment,
-** also known as GENESIS 3 base code.
-**  copyright (C) 2003-2015 Upinder S. Bhalla. and NCBS
-** It is made available under the terms of the
-** GNU Lesser General Public License version 2.1
-** See the file COPYING.LIB for the full notice.
-**********************************************************************/
-#ifdef USE_SBML
-
-#include "header.h"
-#ifdef USE_SBML
-#include <sbml/SBMLTypes.h>
-#endif
-#include "MooseSbmlWriter.h"
-#include "../shell/Wildcard.h"
-#include "../shell/Neutral.h"
-#include <set>
-#include <sstream>
-#include "../shell/Shell.h"
-#include "../kinetics/lookupVolumeFromMesh.h"
-//#include "../manager/SimManager.h"
-/**
-*  write a Model after validation
-*/
-/* ToDo: Tables should be implemented.
-		 Assignment rules are assumed to be addition, since genesis files have just SUMTOTAL,
-		 but now that the "Function" class exists, any arbitrary function can be read and written.
- */
-
-using namespace std;
-
-int moose::SbmlWriter::write( string target , string filepath)
-{
-  //cout << "Sbml Writer: " << filepath << " ---- " << target << endl;
-#ifdef USE_SBML
-  string::size_type loc;
-  while ( ( loc = filepath.find( "\\" ) ) != string::npos ) 
-    {
-      filepath.replace( loc, 1, "/" );
-    }
-  if ( filepath[0]== '~' )
-    {
-      cerr << "Error : Replace ~ with absolute path " << endl;
-    }
-  string filename = filepath;
-  string::size_type last_slashpos = filename.find_last_of("/");
-  filename.erase( 0,last_slashpos + 1 );  
-  
-  //* Check: I have to come back to this and decide what to do with files like text.xml2, and also keep an eye on special characters **/
-  vector< string > fileextensions;
-  fileextensions.push_back( ".xml" );
-  fileextensions.push_back( ".zip" );
-  fileextensions.push_back( ".bz2" );
-  fileextensions.push_back( ".gz" );
-  vector< string >::iterator i;
-  for( i = fileextensions.begin(); i != fileextensions.end(); i++ ) 
-    {
-      string::size_type loc = filename.find( *i );
-      if ( loc != string::npos ) 
-	{
-	  int strlen = filename.length(); 
-	  filename.erase( loc,strlen-loc );
-	  break;
-	}
-    }
-  if ( i == fileextensions.end() && filename.find( "." ) != string::npos )
-    {
-      string::size_type loc;
-      while ( ( loc = filename.find( "." ) ) != string::npos ) 
-	{
-	  filename.replace( loc, 1, "_" );
-	}
-    }
-  
-  if ( i == fileextensions.end() )
-    filepath += ".xml";
-  //std::pair<int, std::string > infoValue();
-  SBMLDocument sbmlDoc;
-  bool SBMLok = false;
-  createModel( filename,sbmlDoc,target );
-  //cout << " sbmlDoc " <<sbmlDoc.toSBML()<<endl;
-  SBMLok  = validateModel( &sbmlDoc );
-  if ( SBMLok ) 
-  {
-    return writeModel( &sbmlDoc, filepath );
-  }
-  //delete sbmlDoc;
-  if ( !SBMLok ) {
-    cerr << "Errors encountered " << endl;
-    return -1;
-  }
-  
-#endif     
-  return -2;
-}
-
-/*
-1   - Write successful
-0   - Write failed
--1  - Validation failed
--2  - No libsbml support
-*/
-#ifdef USE_SBML
-
-//* Check: first write a model according to L3V1 with libsbml 5.9.0 **/
-
-//* Create an SBML model in the SBML Level 3 version 1 specification **/
-void moose::SbmlWriter::createModel(string filename,SBMLDocument& sbmlDoc,string path)
-{ 
-  XMLNamespaces  xmlns;
-  xmlns.add("http://www.sbml.org/sbml/level3/version1");
-  xmlns.add("http://www.moose.ncbs.res.in","moose");
-  xmlns.add("http://www.w3.org/1999/xhtml","xhtml");
-  sbmlDoc.setNamespaces(&xmlns);
-  cremodel_ = sbmlDoc.createModel();
-  cremodel_->setId(filename);
-  cremodel_->setTimeUnits("second");
-  cremodel_->setExtentUnits("substance");
-  cremodel_->setSubstanceUnits("substance");
-  
-  Id baseId(path);
-  vector< ObjId > graphs;
-  string plots;
-  wildcardFind(path+"/##[TYPE=Table2]",graphs);
-  
-  for ( vector< ObjId >::iterator itrgrp = graphs.begin(); itrgrp != graphs.end();itrgrp++)
-    {  
-    	vector< ObjId > msgMgrs = 
-                Field< vector< ObjId > >::get( *itrgrp, "msgOut" );
-      	for (unsigned int i = 0; i < msgMgrs.size();++i)
-      	{
-      		Id msgId = Field< Id >::get( msgMgrs[i], "e2" );
-   			string msgpath = Field <string> :: get(msgId,"path");
-   			string msgname = Field <string> :: get(msgId,"name");
-   			string Clean_msgname = nameString(msgname);
-   			ObjId compartment(msgpath);
-   			//starting with compartment Level
-			while( Field<string>::get(compartment,"className") != "CubeMesh"
-		 		&& Field<string>::get(compartment,"className") != "CylMesh"
-		 		)
-		 		compartment = Field<ObjId> :: get(compartment, "parent");
-			string cmpt	 = Field < string > :: get(compartment,"name");
-			std::size_t found = msgpath.find(cmpt);
-			string path = msgpath.substr(found-1,msgpath.length());
-			std::size_t found1 = path.rfind(msgname);
-			//Replace with the clean name, e.g. "." is replaced with _dot_ etc.
-			string plotpath = path.replace(found1,path.length(),Clean_msgname);
-			plots += plotpath+";";
-   			
-      	}
-    }
-  
-  ostringstream modelAnno;
-  modelAnno << "<moose:ModelAnnotation>\n";
-  /*modelAnno << "<moose:runTime> " << runtime << " </moose:runTime>\n";
-  modelAnno << "<moose:simdt> " << simdt << " </moose:simdt>\n";
-  modelAnno << "<moose:plotdt> " << plotdt << " </moose:plotdt>\n";
-  */
-  //cout << " runtime " << runtime << " " << simdt << " "<<plotdt;
-  modelAnno << "<moose:plots> "<< plots<< "</moose:plots>\n";
-  modelAnno << "</moose:ModelAnnotation>";
-  XMLNode* xnode =XMLNode::convertStringToXMLNode( modelAnno.str() ,&xmlns);
-  cremodel_->setAnnotation( xnode );	
-  
-  UnitDefinition *ud1 = cremodel_->createUnitDefinition();
-  ud1->setId("volume");
-  Unit * u = ud1->createUnit();
-  u->setKind(UNIT_KIND_LITRE);
-  u->setMultiplier(1.0);
-  u->setExponent(1.0);
-  u->setScale(0);
-  
-  UnitDefinition * unitdef;
-  Unit* unit;
-  unitdef = cremodel_->createUnitDefinition();
-  unitdef->setId("substance");
-  unit = unitdef->createUnit();
-  unit->setKind( UNIT_KIND_ITEM );
-  unit->setMultiplier(1);
-  unit->setExponent(1.0);
-  unit->setScale(0);
-
-  //Getting Compartment from moose
-  vector< ObjId > chemCompt;
-  wildcardFind(path+"/##[ISA=ChemCompt]",chemCompt);
-  for ( vector< ObjId >::iterator itr = chemCompt.begin(); itr != chemCompt.end();itr++)
-  { 
-      //TODO: check how to get objectDimensions
-      vector < unsigned int>dims;
-      unsigned int dims_size;
-
-      /*vector <unsigned int>dims = Field <vector <unsigned int> > :: get(ObjId(*itr),"objectDimensions");
-
-        if (dims.size() == 0){
-        dims_size = 1;
-        }
-        if (dims.size()>0){ 
-        dims_size= dims.size();
-        }
-        */
-      dims_size = 1;
-
-      for (unsigned index = 0; index < dims_size; ++index)
-      { 
-          string comptName = Field<string>::get(*itr,"name");
-          string comptPath = Field<string>::get(*itr,"path");
-          ostringstream cid;
-          cid << *itr  << "_" << index;
-          comptName = nameString(comptName);
-          string comptname_id = comptName + "_" + cid.str() + "_";
-          //changeName(comptname,cid.str());
-          string clean_comptname = idBeginWith(comptname_id);
-          double size = Field<double>::get(ObjId(*itr,index),"Volume");
-          unsigned int ndim = Field<unsigned int>::get(ObjId(*itr,index),"NumDimensions");
-
-          ::Compartment* compt = cremodel_->createCompartment();
-          compt->setId(clean_comptname);
-          compt->setName(comptName);
-          compt->setSpatialDimensions(ndim);
-          compt->setUnits("volume");
-          compt->setConstant(true);
-          if(ndim == 3)
-              // Units for Compartment in moose are cubic metres;
-              //   units for Compartment in SBML are litres,
-              //   so multiply by 1000.
-              compt->setSize(size*1e3);
-          else
-              compt->setSize(size);
-
-          // All the pools are taken here 
-          vector< ObjId > Compt_spe;
-          wildcardFind(comptPath+"/##[ISA=PoolBase]",Compt_spe);
-
-          //vector< Id > Compt_spe = LookupField< string, vector< Id > >::get(*itr, "neighbors", "remesh" );
-          int species_size = 1;
-          string objname;
-          for (vector <ObjId> :: iterator itrp = Compt_spe.begin();itrp != Compt_spe.end();itrp++)
-          { string objclass = Field<string> :: get(*itrp,"className");
-              string clean_poolname = cleanNameId(*itrp,index);
-              double initAmt = Field<double> :: get(*itrp,"nInit");
-              Species *sp = cremodel_->createSpecies();
-              sp->setId( clean_poolname );
-              string poolname = Field<string> :: get(*itrp,"name");
-              std::size_t found = poolname.find("cplx");
-              //If the species name contains "cplx", assume it is a complex molecule. In genesis the
-              // same cplx name can occur in several places, since it is built under the enzyme site,
-              // but in SBML this is not possible, so the name becomes enzsite_enzname_cplxname.
-              if (found!=std::string::npos)
-              {	vector < Id > rct = LookupField <string,vector < Id> >::get(*itrp, "neighbors","reacDest");
-                  std::set < Id > rctprd;
-                  rctprd.insert(rct.begin(),rct.end());
-                  for (std::set < Id> :: iterator rRctPrd = rctprd.begin();rRctPrd!=rctprd.end();rRctPrd++)
-                  { 
-                      string enz = Field<string> :: get(*rRctPrd,"name");
-                      ObjId meshParent = Neutral::parent( rRctPrd->eref() );
-                      string enzPoolsite = Field<string>::get(ObjId(meshParent),"name") ;
-                      objname = nameString(enzPoolsite)+"_"+nameString(enz)+"_"+nameString(poolname);
-                  }
-              }
-              else
-                  objname = nameString(poolname);
-
-              sp->setName( objname);
-              sp->setCompartment( clean_comptname );
-              // As of 12-6-2013:
-              //   Pool units in MOOSE are millimolar; the SBML default for a pool is mole.
-              //   So 'conc' would need converting from millimolar to mole:
-              //   molar (M) = moles/litre
-              //   mole = Molar * 1e-3 * Vol * 1000 * Avogadro's number
-              //   (1e-3 for milli; 1000 converts cubic metres to litres)
-              // As of 2-7-2013:
-              //   Units here are 'item' (particle numbers), so nInit is used directly.
-              sp->setInitialAmount( initAmt ); 
-
-              string groupName = getGroupinfo(*itrp);
-              string path = Field<string> :: get(*itrp,"path");
-              ostringstream speciAnno,speciGpname,specixyCord;
-              bool speciannoexist = false;
-              double x,y;
-              Id annotaIdS( path+"/info");
-
-              if (annotaIdS != Id())
-              {   string noteClass = Field<string> :: get(annotaIdS,"className");
-                  x = Field <double> :: get(annotaIdS,"x");
-                  y = Field <double> :: get(annotaIdS,"y");
-                  // specixyCord << "<moose:xCord>" << x << "</moose:xCord>\n";
-                  // specixyCord << "<moose:yCord>" << y << "</moose:yCord>\n";
-                  // speciannoexist = true;
-
-                  string notes;
-                  if (noteClass =="Annotator")
-                  { notes = Field <string> :: get(annotaIdS,"notes");
-                      if (notes != "")	
-                      { 	string cleanNotes = nameString1(notes);
-                          string notesString = "<body xmlns=\"http://www.w3.org/1999/xhtml\">\n \t \t"+
-                              cleanNotes + "\n\t </body>"; 
-                          sp->setNotes(notesString);
-                      }
-                  }
-              }
-              sp->setUnits("substance");
-              species_size = species_size+1;
-              // true if species is amount, false: if species is concentration 
-              sp->setHasOnlySubstanceUnits( true );
-              if (!groupName.empty())
-              {	speciGpname << "<moose:Group>"<< groupName << "</moose:Group>\n";
-                  speciannoexist = true;
-              }
-              if (speciannoexist)
-              {		ostringstream speciAnno;
-                  speciAnno << "<moose:ModelAnnotation>\n";
-                  if (!speciGpname.str().empty())
-                      speciAnno << speciGpname.str();
-                  if (!specixyCord.str().empty())
-                      speciAnno <<specixyCord.str();
-                  speciAnno << "</moose:ModelAnnotation>";
-                  XMLNode* xnode =XMLNode::convertStringToXMLNode( speciAnno.str() ,&xmlns);
-                  sp->setAnnotation( xnode );
-              }
-
-              // Setting boundaryCondition and constant for a BufPool, per this rule:
-              //   constant   boundaryCondition   has assignment/rate rule   can be sub/prd
-              //   false      true                yes                        yes
-              //   true       true                no                         yes
-              if ( (objclass == "BufPool") || (objclass == "ZombieBufPool"))
-              {	 //BoundaryCondition is made for buff pool
-                  sp->setBoundaryCondition(true);
-                  string Funcpoolname = Field<string> :: get(*itrp,"path");
-                  vector< Id > children = Field< vector< Id > >::get( *itrp, "children" );
-
-                  if (children.size() == 0)
-                      sp->setConstant( true );
-                    
-                  for ( vector< Id >::iterator i = children.begin(); i != children.end(); ++i ) 
-                  {	string funcpath = Field <string> :: get(*i,"path");
-                      string clsname = Field <string> :: get(*i,"className");
-                      if (clsname == "Function" or clsname == "ZombieFunction")
-                      {  	Id funcIdx(funcpath+"/x");
-                          string f1x = Field <string> :: get(funcIdx,"path");
-                          vector < Id > inputPool = LookupField <string,vector <Id> > :: get(funcIdx,"neighbors","input");
-                          int sumtot_count = inputPool.size();
-                          if (sumtot_count > 0)
-                          {	sp->setConstant(false);
-                              ostringstream sumtotal_formula;
-                              for(vector< Id >::iterator itrsumfunc = inputPool.begin();itrsumfunc != inputPool.end(); itrsumfunc++)
-                              {
-                                  string sumpool = Field<string> :: get(*itrsumfunc,"name");
-                                  sumtot_count -= 1;
-                                  string clean_sumFuncname = cleanNameId(*itrsumfunc,index);
-                                  if ( sumtot_count == 0 )
-                                      sumtotal_formula << clean_sumFuncname;
-                                  else
-                                      sumtotal_formula << clean_sumFuncname << "+";
-                              }
-                              Rule * rule =  cremodel_->createAssignmentRule();
-                              rule->setVariable( clean_poolname );
-                              rule->setFormula( sumtotal_formula.str() );
-                          }
-                          else
-                              cout << "No assignment rule found for \"" << poolname << "\", but the function might use a constant as a variable" << endl;
-                      }
-                      //If assignment rule is not found then constant is made true
-                      else
-                          sp->setConstant(true);
-
-                  } //for vector
-              }//zbufpool
-              else
-              {  // Not a BufPool, so a plain Pool:
-                  sp->setBoundaryCondition(false);
-                  sp->setConstant(false);
-              }
-          } //itrp
-
-          vector< ObjId > Compt_Reac;
-          wildcardFind(comptPath+"/##[ISA=ReacBase]",Compt_Reac);
-          //vector< Id > Compt_Reac = LookupField< string, vector< Id > >::get(*itr, "neighbors", "remeshReacs" );
-          for (vector <ObjId> :: iterator itrR= Compt_Reac.begin();itrR != Compt_Reac.end();itrR++)
-          { 
-              Reaction* reaction;
-              reaction = cremodel_->createReaction(); 
-              string cleanReacname = cleanNameId(*itrR,index);
-              string recClass = Field<string> :: get(*itrR,"className");
-              string pathR = Field<string> :: get(*itrR,"path");
-              ostringstream reactAnno,reactGpname,reactxyCord;
-              bool reactannoexist = false;
-              string noteClassR;
-              double x,y;
-              Id annotaIdR(pathR+"/info");
-              if (annotaIdR != Id())
-              {
-                  noteClassR = Field<string> :: get(annotaIdR,"className");
-                  x = Field <double> :: get(annotaIdR,"x");
-                  y = Field <double> :: get(annotaIdR,"y");
-                  // reactxyCord << "<moose:xCord>" << x << "</moose:xCord>\n";
-                  // reactxyCord << "<moose:yCord>" << y << "</moose:yCord>\n";
-                  // reactannoexist = true;
-                  string notesR;
-                  if (noteClassR =="Annotator")
-                  {
-                      notesR = Field <string> :: get(annotaIdR,"notes");
-                      if (notesR != "")
-                      {	string cleanNotesR = nameString1(notesR);
-                          string notesStringR = "<body xmlns=\"http://www.w3.org/1999/xhtml\">\n \t \t"+
-                              cleanNotesR + "\n\t </body>"; 
-                          reaction->setNotes(notesStringR);
-                      }
-                  }
-              }	
-
-              string groupName = getGroupinfo(*itrR);
-              if (!groupName.empty())
-              {	reactGpname << "<moose:Group>"<< groupName << "</moose:Group>\n";
-                  reactannoexist = true;
-              }
-              if (reactannoexist)
-              {		ostringstream reactAnno;
-                  reactAnno << "<moose:ModelAnnotation>\n";
-                  if (!reactGpname.str().empty())
-                      reactAnno << reactGpname.str();
-                  if (!reactxyCord.str().empty())
-                      reactAnno <<reactxyCord.str();
-                  reactAnno << "</moose:ModelAnnotation>";
-                  XMLNode* xnode =XMLNode::convertStringToXMLNode( reactAnno.str() ,&xmlns);
-                  reaction->setAnnotation( xnode );
-              }	
-
-              string objname = Field<string> :: get(*itrR,"name");
-              objname = nameString(objname);
-
-              KineticLaw* kl;
-              // Reaction 
-              reaction->setId( cleanReacname);
-              reaction->setName( objname);
-              double Kf = Field<double>::get(*itrR,"numKf");
-              double Kb = Field<double>::get(*itrR,"numKb");
-
-              if (Kb == 0.0)
-                  reaction->setReversible( false );
-              else
-                  reaction->setReversible( true );
-              reaction->setFast( false );
-              // Reaction's Reactant are Written 
-              ostringstream rate_law,kfparm,kbparm;
-              double rct_order = 0.0;
-              kfparm << cleanReacname << "_" << "Kf";
-              rate_law << kfparm.str();
-
-              // This call writes out the reactants and updates the rate_law string.
-              getSubPrd(reaction,"sub","",*itrR,index,rate_law,rct_order,true,recClass);
-              double prd_order =0.0;
-              kbparm << cleanReacname << "_" << "Kb";
-
-              // This call writes out the products and updates the rate_law string if Kb != 0.
-              if ( Kb != 0.0 ){
-                  rate_law << "-" << kbparm.str();
-                  getSubPrd(reaction,"prd","",*itrR,index,rate_law,prd_order,true,recClass);
-              }
-              else
-                  getSubPrd(reaction,"prd","",*itrR,index,rate_law,prd_order,false,recClass);
-
-              kl = reaction->createKineticLaw();
-              kl->setFormula( rate_law.str() );
-              double rvalue,pvalue;
-              rvalue = Kf;
-              string unit=parmUnit( rct_order-1 );
-              printParameters( kl,kfparm.str(),rvalue,unit ); 
-              if ( Kb != 0.0 ){
-                  pvalue = Kb;
-                  string unit=parmUnit( prd_order-1 );
-                  printParameters( kl,kbparm.str(),pvalue,unit ); 
-              }
-          }//itrR
-          //     Reaction End 
-
-          // Enzyme start
-          vector< ObjId > Compt_Enz;
-          wildcardFind(comptPath+"/##[ISA=EnzBase]",Compt_Enz);
-          for (vector <ObjId> :: iterator itrE=Compt_Enz.begin();itrE != Compt_Enz.end();itrE++)
-          { string enzClass = Field<string>::get(*itrE,"className");
-              string cleanEnzname = cleanNameId(*itrE,index);
-              string pathE = Field < string > :: get(*itrE,"path");
-              Reaction* reaction;
-              reaction = cremodel_->createReaction();
-              ostringstream enzAnno,enzGpname,enzxyCord;
-              string noteClassE;
-              string notesE;
-              double x,y;
-              bool enzAnnoexist = false;
-              Id annotaIdE(pathE+"/info");
-              if (annotaIdE != Id())
-              {
-                  noteClassE = Field<string> :: get(annotaIdE,"className");
-                  x = Field <double> :: get(annotaIdE,"x");
-                  y = Field <double> :: get(annotaIdE,"y");
-                  // enzxyCord << "<moose:xCord>" << x << "</moose:xCord>\n";
-                  // enzxyCord << "<moose:yCord>" << y << "</moose:yCord>\n";
-                  // enzAnnoexist = true;
-                  if (noteClassE =="Annotator")
-                  {
-                      notesE = Field <string> :: get(annotaIdE,"notes");
-                      if (notesE != "")
-                      {	string cleanNotesE = nameString1(notesE);
-                          string notesStringE = "<body xmlns=\"http://www.w3.org/1999/xhtml\">\n \t \t"+
-                              cleanNotesE + "\n\t </body>"; 
-                          reaction->setNotes(notesStringE);
-                      }
-                  }
-              }	
-
-              string groupName = getGroupinfo(*itrE);
-              if (!groupName.empty())
-              {	enzGpname << "<moose:Group>"<< groupName << "</moose:Group>\n";
-                  enzAnnoexist = true;
-              }
-              string objname = Field < string> :: get(*itrE,"name");
-              objname = nameString(objname);
-              KineticLaw* kl;
-              if ( (enzClass == "Enz") || (enzClass == "ZombieEnz"))
-              {// Complex Formation S+E -> SE*;
-                  reaction->setId( cleanEnzname);
-                  reaction->setName( objname);
-                  reaction->setFast ( false );
-                  reaction->setReversible( true);
-                  string enzname = Field<string> :: get(*itrE,"name");
-                  ostringstream enzid;
-                  enzid << (*itrE) <<"_"<<index;
-                  enzname = nameString(enzname);
-                  string enzNameAnno = enzname;
-                  ostringstream Objid;
-                  Objid << (*itrE) <<"_"<<index <<"_";
-                  string enzName = enzname + "_" + Objid.str();
-                  enzName = idBeginWith( enzName );
-                  ostringstream enzAnno;
-                  enzAnno <<"<moose:EnzymaticReaction>";
-                  string groupName = getGroupinfo(*itrE);
-                  // if (!groupName.empty())
-                  // 	enzAnno << "<moose:Group>"<<groupName<<"</moose:Group>";
-                  double k1 = Field<double>::get(*itrE,"k1");
-                  double k2 = Field<double>::get(*itrE,"k2");
-                  ostringstream rate_law;
-                  double rct_order = 0.0,prd_order=0.0;
-                  rate_law << "k1";
-                  getSubPrd(reaction,"enzOut","sub",*itrE,index,rate_law,rct_order,true,enzClass);
-                  for(unsigned int i =0;i<nameList_.size();i++)
-                      enzAnno << "<moose:enzyme>"<<nameList_[i]<<"</moose:enzyme>\n";
-                  getSubPrd(reaction,"sub","",*itrE,index,rate_law,rct_order,true,enzClass);
-                  for (unsigned int i =0;i<nameList_.size();i++)
-                      enzAnno << "<moose:substrates>"<<nameList_[i]<<"</moose:substrates>\n";
-                  // product 
-                  rate_law << "-" << "k2";
-                  getSubPrd(reaction,"cplxDest","prd",*itrE,index,rate_law,prd_order,true,enzClass);
-                  for(unsigned int i =0;i<nameList_.size();i++)
-                      enzAnno << "<moose:product>"<<nameList_[i]<<"</moose:product>\n";
-
-                  enzAnno <<"<moose:groupName>"<<enzName<<"</moose:groupName>\n";
-                  enzAnno << "<moose:stage>1</moose:stage> \n";
-                  // if (!enzGpname.str().empty())
-                  // enzAnno << enzGpname.str();
-                  if (!enzxyCord.str().empty())
-                      enzAnno <<enzxyCord.str();
-                  enzAnno << "</moose:EnzymaticReaction>";
-
-                  XMLNode* xnode =XMLNode::convertStringToXMLNode( enzAnno.str() ,&xmlns);
-                  reaction->setAnnotation( xnode );	
-                  kl = reaction->createKineticLaw();
-                  kl->setFormula( rate_law.str() );
-                  string unit=parmUnit( rct_order-1 );
-                  printParameters( kl,"k1",k1,unit ); 
-                  string punit=parmUnit( prd_order-1 );
-                  printParameters( kl,"k2",k2,punit ); 
-                  // Stage 2: SE* -> E + P
-
-                  Objid << "Product_formation";
-                  string enzName1 = enzname + "_" + Objid.str() + "_";
-                  enzName1 = idBeginWith( enzName1 );
-                  //Reaction* reaction;
-                  reaction = cremodel_->createReaction(); 
-                  reaction->setId( enzName1 );
-                  reaction->setName( objname);
-                  reaction->setFast( false );
-                  reaction->setReversible( false );
-                  if (notesE != ""){
-                      string cleanNotesE = nameString1(notesE);
-                      string notesStringE = "<body xmlns=\"http://www.w3.org/1999/xhtml\">\n \t \t"+
-                          cleanNotesE + "\n\t </body>"; 
-                      reaction->setNotes(notesStringE);
-                  }
-                  double k3 = Field<double>::get(*itrE,"k3");
-                  double erct_order = 0.0,eprd_order = 0.0;
-                  ostringstream enzrate_law;
-                  enzrate_law << "k3";
-                  string enzAnno2 = "<moose:EnzymaticReaction>";
-
-                  getSubPrd(reaction,"cplxDest","sub",*itrE,index,enzrate_law,erct_order,true,enzClass);
-                  for(unsigned int i =0;i<nameList_.size();i++)
-                      enzAnno2 +=  "<moose:complex>"+nameList_[i]+"</moose:complex>\n";
-
-                  getSubPrd(reaction,"enzOut","prd",*itrE,index,enzrate_law,eprd_order,false,enzClass);
-                  for(unsigned int i =0;i<nameList_.size();i++)
-                      enzAnno2 += "<moose:enzyme>"+nameList_[i]+"</moose:enzyme>\n";
-
-                  getSubPrd(reaction,"prd","",*itrE,index,enzrate_law,eprd_order,false,enzClass);
-                  for(unsigned int i =0;i<nameList_.size();i++)
-                      enzAnno2 += "<moose:product>"+nameList_[i]+"</moose:product>\n";
-
-                  enzAnno2 += "<moose:groupName>"+enzName+"</moose:groupName>\n";
-                  enzAnno2 += "<moose:stage>2</moose:stage> \n";
-                  enzAnno2 += "</moose:EnzymaticReaction>";
-                  XMLNode* xnode2 =XMLNode::convertStringToXMLNode( enzAnno2 ,&xmlns);
-                  reaction->setAnnotation( xnode2 );	
-                  kl = reaction->createKineticLaw();
-                  kl->setFormula( enzrate_law.str() );
-                  printParameters( kl,"k3",k3,"per_second" );
-              } //enzclass = Enz
-              else if ( (enzClass == "MMenz") || (enzClass == "ZombieMMenz"))
-              { 
-                  reaction->setId( cleanEnzname);
-                  reaction->setName( objname);
-                  string groupName = getGroupinfo(*itrE);
-                  /*
-                     if (enzAnnoexist)
-                     {	ostringstream MMenzAnno;
-                     MMenzAnno << "<moose:ModelAnnotation>\n";
-                  // 	if (!enzGpname.str().empty())
-                  // MMenzAnno << enzGpname.str();
-                  if (!enzxyCord.str().empty())
-                  MMenzAnno <<enzxyCord.str();
-                  MMenzAnno << "</moose:ModelAnnotation>";
-                  XMLNode* xnode =XMLNode::convertStringToXMLNode( MMenzAnno.str() ,&xmlns);
-                  reaction->setAnnotation( xnode );
-                  }
-                  */	
-                  double Km = Field<double>::get(*itrE,"numKm");
-                  double kcat = Field<double>::get(*itrE,"kcat");
-                  reaction->setReversible( false );
-                  reaction->setFast( false );
-                  // Substrate 
-                  ostringstream rate_law,sRate_law,fRate_law;
-                  double rct_order = 0.0,prd_order=0.0;
-                  getSubPrd(reaction,"sub","",*itrE,index,rate_law,rct_order,true,enzClass);
-                  sRate_law << rate_law.str();
-                  // Modifier 
-                  getSubPrd(reaction,"enzDest","",*itrE,index,rate_law,rct_order,true,enzClass);
-                  // product 
-                  getSubPrd(reaction,"prd","",*itrE,index,rate_law,prd_order,false,enzClass);
-                  kl = reaction->createKineticLaw();
-                  string s = sRate_law.str();
-                  if(!s.empty()) {
-                      s = s.substr(1); 
-                  } 
-                  fRate_law << "kcat" << rate_law.str() << "/" << "(" << "Km" << " +" << s << ")"<<endl;
-                  kl->setFormula( fRate_law.str() );
-                  kl->setNotes("<body xmlns=\"http://www.w3.org/1999/xhtml\">\n\t\t" + fRate_law.str() + "\n\t </body>");
-                  printParameters( kl,"Km",Km,"substance" ); 
-                  string kcatUnit = parmUnit( 0 );
-                  printParameters( kl,"kcat",kcat,kcatUnit );
-              } //enzClass == "MMenz"
-
-          } //itrE
-          // Enzyme End
-      }//index
-  }//itr
-  //return sbmlDoc;
-
-}//createModel
-#endif
-
-/** Returns the name of the enclosing group, if any, for the given object. **/
-
-string moose::SbmlWriter :: getGroupinfo(Id itr)
-{   // Harsha: Note: for now I assume that if a group exists:
-	//  1. for a 'pool' it lies between the compartment and the pool: /modelpath/Compartment/Group/pool
-	//  2. for an 'enzCplx' of an explicit enzyme it is /modelpath/Compartment/Group/Pool/Enz/Pool_cplx
-	// These cases are handled here; if subgroups exist, this code needs further cleanup.
-
-	ObjId pa = Field< ObjId >::get( itr, "parent" );
-	string parentclass = Field<string> :: get (pa,"className");
-	string groupName;
-	if (parentclass == "CubeMesh" or parentclass == "CylMesh")
-		return (groupName);
-	else if (parentclass == "Neutral")
-		groupName = Field<string> :: get(pa,"name");	  
-  	else if ((parentclass == "Enz") or (parentclass == "ZombieEnz"))
-  	{ 
-  	 	ObjId ppa = Field< ObjId >::get( pa, "parent" );
-	   	string parentParentclass = Field<string> :: get (ppa,"className");
-	   	if (parentParentclass == "Neutral")
-	   		groupName = Field<string> :: get(ppa,"name");
-	   	
-	   	else if ( parentParentclass == "Pool" or parentParentclass == "ZombiePool" or
-	     	          parentParentclass == "BufPool" or parentParentclass == "ZombieBufPool")
-	   	{
-	   		ObjId poolpa = Field< ObjId >::get( ppa, "parent" );
-	   		string pathtt = Field < string > :: get(poolpa,"path");
-	   		parentParentclass = Field<string> :: get (poolpa,"className");
-	   		if (parentParentclass == "Neutral")
-	   	  		groupName = Field<string> :: get(poolpa,"name");
-	   	  	  
-      	}
-    }//else if
-    else
-  	{  	ObjId ppa = Field< ObjId >::get( pa, "parent" );
-		string parentclass = Field<string> :: get (ppa,"className");
-		if (parentclass =="Neutral")
-			groupName = Field<string> :: get(ppa,"name");
-	} //else
-	return( groupName);
-}
-/** Writes the given SBMLDocument to the given file. **/
-bool moose::SbmlWriter::writeModel( const SBMLDocument* sbmlDoc, const string& filename )
-{
-  SBMLWriter sbmlWriter;
-  // cout << "sbml writer" << filename << sbmlDoc << endl;
-  bool result = sbmlWriter.writeSBML( sbmlDoc, filename.c_str() );
-  if ( result )
-    {
-      cout << "Wrote file \"" << filename << "\"" << endl;
-      return true;
-    }
-  else
-    {
-      cerr << "Failed to write \"" << filename << "\"; check the path and write permissions" << endl;
-      return false;
-    }
-}
-//string moose::SbmlWriter::cleanNameId(vector < Id > const &itr,int index)
-
-string moose::SbmlWriter::cleanNameId(Id itrid,int  index)
-{ string objname = Field<string> :: get(itrid,"name");
-  string objclass = Field<string> :: get(itrid,"className");
-  ostringstream Objid;
-  Objid << (itrid) <<"_"<<index;
-  objname = nameString(objname);
-  string objname_id = objname + "_" + Objid.str() + "_";
-    //changeName(objname,Objid.str());
-  if (objclass == "MMenz" || objclass == "ZombieMMenz")
-    { objname_id = objname_id + "_MM_Reaction_";
-    }
-  else if (objclass == "Enz" || objclass == "ZombieEnz")
-    { objname_id = objname_id + "Complex_formation_";
-    }
-  string clean_nameid = idBeginWith(objname_id);
-  return clean_nameid ;
-}
-void moose::SbmlWriter::getSubPrd(Reaction* rec,string type,string enztype,Id itrRE, int index,ostringstream& rate_law,double &rct_order,bool w,string re_enClass)
-{
-  nameList_.clear();
-  SpeciesReference* spr;
-  ModifierSpeciesReference * mspr;
-  vector < Id > rct = LookupField <string,vector < Id> >::get(itrRE, "neighbors",type);
-  std::set < Id > rctprdUniq;
-  rctprdUniq.insert(rct.begin(),rct.end());
-  for (std::set < Id> :: iterator rRctPrd = rctprdUniq.begin();rRctPrd!=rctprdUniq.end();rRctPrd++)
-    { double stoch = count( rct.begin(),rct.end(),*rRctPrd );
-      string objname = Field<string> :: get(*rRctPrd,"name");
-      string cleanObjname = nameString(objname);
-      string clean_name = cleanNameId(*rRctPrd,index);
-      string objClass = Field<string> :: get(*rRctPrd,"className");
-
-      if (type == "sub" or (type == "enzOut" and enztype == "sub" ) or (type == "cplxDest" and enztype == "sub")) 
-	{ spr = rec->createReactant();
-	  spr->setSpecies(clean_name);
-	  spr->setStoichiometry( stoch );
-	  
-	  if (objClass == "BufPool")
-	    spr->setConstant( true );
-	  else
-	    spr->setConstant( false);
-	  
-	}
-      else if(type == "prd" or (type == "enzOut" and enztype == "prd" ) or (type == "cplxDest" and enztype == "prd"))
-	{
-	  spr = rec->createProduct();
-	  spr->setSpecies( clean_name );
-	  spr->setStoichiometry( stoch );
-	  if (objClass == "BufPool")
-	    spr->setConstant( true );
-	  else
-	    spr->setConstant( false);
-	}
-      else if(type == "enzDest")
-	{
-	  mspr = rec->createModifier();
-	  mspr->setSpecies(clean_name);
-	}
-      /* Updating list of object for annotation for Enzymatic reaction */
-	   if (re_enClass =="Enz" or re_enClass == "ZombieEnz")
-		nameList_.push_back	(clean_name);
-
-      /* Rate law is also updated in rate_law string */
-      //std::size_t found = clean_name.find("cplx");
-      //cout << " stoch" << stoch << " c name " << clean_name;
-      if (w)
-	{
-	  rct_order += stoch;
-	  if ( stoch == 1 )
-	    rate_law << "*" << clean_name;
-	  else
-	    rate_law << "*" <<clean_name << "^" << stoch;
-	}
-    } //rRct
-  //return rctprdUniq ;
-}
-
-void moose::SbmlWriter::getModifier(ModifierSpeciesReference* mspr,vector < Id> mod, int index,ostringstream& rate_law,double &rct_order,bool w)
-{ 
-  std::set < Id > modifierUniq;
-  modifierUniq.insert(mod.begin(),mod.end());
-  for (std::set < Id> :: iterator rmod = modifierUniq.begin();rmod!=modifierUniq.end();rmod++)
-    { double stoch = count( mod.begin(),mod.end(),*rmod );
-      string clean_name = cleanNameId(*rmod,index);
-      mspr->setSpecies( clean_name );
-      /* Rate law is also updated in rate_law string */
-      if (w)
-	{
-	  rct_order += stoch;
-	  if ( stoch == 1 )
-	    rate_law << "*" << clean_name;
-	  else
-	    rate_law << "*" <<clean_name << "^" << stoch;
-	}
-    } //rRct
-  //return modUniq ;
-}
-/** Replaces special characters (used for notes text). **/
-
-string moose::SbmlWriter::nameString1( string str )
-{ string str1;
-  int len = str.length();
-  int i= 0;
-  do
-    {
-      switch( str.at(i) )
-	{
-	  case '&':
-	    str1 = "_and_";
-	    str.replace( i,1,str1 );
-	    len += str1.length()-1;
-	    break;
-	  case '<':
-	    str1 = "_lessthan_";
-	    str.replace( i,1,str1 );
-	    len += str1.length()-1;
-	    break; 
-	case '>':
-	    str1 = "_greaterthan_";
-	    str.replace( i,1,str1 );
-	    len += str1.length()-1;
-	    break; 
-	case '\260':	// degree sign
-	    str1 = "&#176;";
-	    str.replace( i,1,str1 );
-	    len += str1.length()-1;
-	    break;
-	}
-    i++;
-    }while ( i < len );
-  return str;
-
-}
-string moose::SbmlWriter::nameString( string str )
-{ string str1;
-
-  int len = str.length();
-  int i= 0;
-  do
-    {
-      switch( str.at(i) )
-	{
-	case '-':
-	  str1 = "_dash_";
-	  str.replace( i,1,str1 );
-	  len += str1.length()-1;
-	  break;
-	case '\'':
-	  str1 = "_prime_";
-	  str.replace( i,1,str1 );
-	  len += str1.length()-1;
-	  break;
-
-	case '+':
-	  str1 = "_plus_";
-	  str.replace( i,1,str1 );
-	  len += str1.length()-1;
-	  break;
-	case '&':
-	  str1 = "_and_";
-	  str.replace( i,1,str1 );
-	  len += str1.length()-1;
-	  break;
-	case '*':
-	  str1 = "_star_";
-	  str.replace( i,1,str1 );
-	  len += str1.length()-1;
-	  break;
-	case '/':
-	  str1 = "_slash_";
-	  str.replace( i,1,str1 );
-	  len += str1.length()-1;
-	  break;
-	case '(':
-	  str1 = "_bo_";
-	  str.replace( i,1,str1 );
-	  len += str1.length()-1;
-	  break;
-	case ')':
-	  str1 = "_bc_";
-	  str.replace( i,1,str1 );
-	  len += str1.length()-1;
-	  break;
-	case '[':
-	  str1 = "_sbo_";
-	  str.replace( i,1,str1 );
-	  len += str1.length()-1;
-	  break;
-	case ']':
-	  str1 = "_sbc_";
-	  str.replace( i,1,str1 );
-	  len += str1.length()-1;
-	  break;
-	case '.':
-	  str1 = "_dot_";
-	  str.replace( i,1,str1 );
-	  len += str1.length()-1;
-	  break;
-	case ' ':
-	  str1 = "_";
-	  str.replace( i,1,str1 );
-	  len += str1.length()-1;
-	  break;  
-	}//end switch 
-      i++;
-    }while ( i < len );
-  return str;
-}
-/** Id preceded by its parent name. **/
-string moose::SbmlWriter::changeName( string parent, string child )
-{string newName = parent + "_" + child + "_";
- return newName;
-}
-/** Prefixes the id with '_' if it starts with a digit. **/
-string moose::SbmlWriter::idBeginWith( string name )
-{
-  string changedName = name;
-  if ( isdigit(name.at(0)) )
-    changedName = "_" + name;
-  return changedName;
-}
-string moose::SbmlWriter::parmUnit( double rct_order )
-{
-  ostringstream unit_stream;
-  int order = ( int) rct_order;
-  switch( order )
-    {
-      case 0:
-	unit_stream<<"per_second";
-	break;
-      case 1:
-	unit_stream<<"per_item_per_second";
-	break;
-      case 2:
-	unit_stream<<"per_item_sq_per_second";
-	break;
-      default:
-	unit_stream<<"per_item_"<<rct_order<<"_per_second";
-	break;
-    }
-  ListOfUnitDefinitions * lud =cremodel_->getListOfUnitDefinitions();
-  bool flag = false;
-  for ( unsigned int i=0;i<lud->size();i++ )
-    {
-      UnitDefinition * ud = lud->get(i);
-      if ( ud->getId() == unit_stream.str() ){
-	flag = true;
-	break;
-	}
-    }
-  if ( !flag ){
-    UnitDefinition* unitdef;
-    Unit* unit;
-    //cout << "order:" << order << endl;
-    unitdef = cremodel_->createUnitDefinition();
-    unitdef->setId( unit_stream.str() );
-    
-    // Create individual unit objects that will be put inside the UnitDefinition .
-    if (order != 0)
-      { //cout << "order is != 0: " << order << endl;
-	unit = unitdef->createUnit();
-	unit->setKind( UNIT_KIND_ITEM );
-	unit->setExponent( -order );
-	unit->setMultiplier(1);
-	unit->setScale( 0 );
-      }
-
-    unit = unitdef->createUnit();
-    unit->setKind( UNIT_KIND_SECOND );
-    unit->setExponent( -1 );
-    unit->setMultiplier( 1 );
-    unit->setScale ( 0 );
-  }
-  return unit_stream.str();
-}
-void moose::SbmlWriter::printParameters( KineticLaw* kl,string k,double kvalue,string unit )
-{
-  Parameter* para = kl->createParameter();
-  para->setId( k );
-  para->setValue( kvalue );
-  para->setUnits( unit );
-}
-
-string moose::SbmlWriter::findNotes(Id itr)
-{ string path = Field<string> :: get(itr,"path");
-  Id annotaId( path+"/info");
-  string notes;
-  if (annotaId != Id())
-    { string noteClass = Field<string> :: get(annotaId,"className");
-      if (noteClass =="Annotator")
-        notes = Field <string> :: get(annotaId,"notes");
-    }
-  return notes;
-}
-
-/* *  validate a model before writing */
-
-bool moose::SbmlWriter::validateModel( SBMLDocument* sbmlDoc )
-{
-  if ( !sbmlDoc )
-    {cerr << "validateModel: given a null SBML Document" << endl;
-      return false;
-    }
-
-  string consistencyMessages;
-  string validationMessages;
-  bool noProblems                     = true;
-  unsigned int numCheckFailures       = 0;
-  unsigned int numConsistencyErrors   = 0;
-  unsigned int numConsistencyWarnings = 0;
-  unsigned int numValidationErrors    = 0;
-  unsigned int numValidationWarnings  = 0;
-  // Once the whole model is done and before it gets written out, 
-  // it's important to check that the whole model is in fact complete, consistent and valid.
-  numCheckFailures = sbmlDoc->checkInternalConsistency();
-  if ( numCheckFailures > 0 )
-    {
-      noProblems = false;
-      for ( unsigned int i = 0; i < numCheckFailures; i++ )
-	{
-	  const SBMLError* sbmlErr = sbmlDoc->getError(i);
-	  if ( sbmlErr->isFatal() || sbmlErr->isError() )
-	    {
-	      ++numConsistencyErrors;
-	    }
-	  else
-	    {
-	      ++numConsistencyWarnings;
-	    }
-	  } 
-      ostringstream oss;
-      sbmlDoc->printErrors(oss);
-      consistencyMessages = oss.str();
-      }
-  // If the internal checks fail, it makes little sense to attempt
-  // further validation, because the model may be too compromised to
-  // be properly interpreted.
-  if ( numConsistencyErrors > 0 )
-    {
-      consistencyMessages += "Further validation aborted.";
-     }
-  else
-    {
-      numCheckFailures = sbmlDoc->checkConsistency();
-      if ( numCheckFailures > 0 )
-	{
-	  noProblems = false;
-	  for ( unsigned int i = 0; i < numCheckFailures; i++ )
-	    {
-	      const SBMLError* sbmlErr = sbmlDoc->getError(i);
-	      if ( sbmlErr->isFatal() || sbmlErr->isError() )
-		{
-		  ++numValidationErrors;
-		}
-	      else
-		{
-		  ++numValidationWarnings;
-		}      
-	      }
-	  ostringstream oss;
-	  sbmlDoc->printErrors( oss );
-	  validationMessages = oss.str();
-	  }
-	  }
-  if ( noProblems )
-    return true;
-  else
-    {
-      if ( numConsistencyErrors > 0 )
-	{
-	  cout << "ERROR: encountered " << numConsistencyErrors
-	       << " consistency error" << ( numConsistencyErrors == 1 ? "" : "s" )
-	       << " in model '" << sbmlDoc->getModel()->getId() << "'." << endl;
-	  }
-      if ( numConsistencyWarnings > 0 )
-	{
-	  cout << "Notice: encountered " << numConsistencyWarnings
-	       << " consistency warning" << ( numConsistencyWarnings == 1 ? "" : "s" )
-	       << " in model '" << sbmlDoc->getModel()->getId() << "'." << endl;
-	  }
-      cout << endl << consistencyMessages;
-      if ( numValidationErrors > 0 )
-	{
-	  cout << "ERROR: encountered " << numValidationErrors
-	       << " validation error" << ( numValidationErrors == 1 ? "" : "s" )
-	       << " in model '" << sbmlDoc->getModel()->getId() << "'." << endl;
-	}
-      if ( numValidationWarnings > 0 )
-	{
-	  cout << "Notice: encountered " << numValidationWarnings
-	       << " validation warning" << ( numValidationWarnings == 1 ? "" : "s" )
-	       << " in model '" << sbmlDoc->getModel()->getId() << "'." << endl;
-	  }
-      cout << endl << validationMessages;
-      return ( numConsistencyErrors == 0 && numValidationErrors == 0 );
-      }
-}
-#endif // USE_SBML
diff --git a/sbml/MooseSbmlWriter.h b/sbml/MooseSbmlWriter.h
deleted file mode 100644
index 779726ac..00000000
--- a/sbml/MooseSbmlWriter.h
+++ /dev/null
@@ -1,53 +0,0 @@
-/*******************************************************************
- * File:            MooseSbmlWriter.h
- * Description:      
- * Author:          HarsnaRani
- * E-mail:          hrani@ncbs.res.in
- ********************************************************************/
-/**********************************************************************
-** This program is part of 'MOOSE', the
-** Messaging Object Oriented Simulation Environment,
-** also known as GENESIS 3 base code.
-**           copyright (C) 2003-2015 Upinder S. Bhalla. and NCBS
-** It is made available under the terms of the
-** GNU Lesser General Public License version 2.1
-** See the file COPYING.LIB for the full notice.
-**********************************************************************/
-
-#ifndef _SBMLWRITER_H
-#define _SBMLWRITER_H
-#ifdef USE_SBML
-#include <sbml/SBMLTypes.h>
-namespace moose{
-class SbmlWriter
-{
-		
-	public:
-		SbmlWriter() {;}
-		~SbmlWriter() {;}
-		int write( string location, string filename );
-#ifdef USE_SBML
-		void createModel( string filename, SBMLDocument& doc ,string target);
-		bool validateModel(SBMLDocument* sbmlDoc );
-		bool writeModel( const SBMLDocument* sbmlDoc, const string& filename );
-		 
-	private:
-		vector < string >nameList_;
-		Model* cremodel_;	
-		string nameString( string str );
-		string nameString1( string str );
-		string changeName( string parent,string child );
-		string idBeginWith(string name );
-		string cleanNameId( Id itrid,int index);
-		string parmUnit( double rct_order );
-		void getSubPrd(Reaction* rec,string type,string enztype,Id itrRE, int index,ostringstream& rate_law,double &rct_order,bool w, string re_enClass);
-		void getModifier(ModifierSpeciesReference* mspr,vector < Id> mod, int index,ostringstream& rate_law,double &rct_order,bool w);
-		void printParameters( KineticLaw* kl,string k,double kvalue,string unit );
-		string findNotes(Id itr);
-		string getGroupinfo(Id itr);
-#endif
-};
-} // namespace moose
-//extern const Cinfo* initCinfo();
-#endif //USE_SBML
-#endif // _SBMLWRITER_H
diff --git a/scheduling/Clock.cpp b/scheduling/Clock.cpp
index a90cb7c9..433e52e0 100644
--- a/scheduling/Clock.cpp
+++ b/scheduling/Clock.cpp
@@ -354,6 +354,7 @@ const Cinfo* Clock::initCinfo()
         "	SimpleSynHandler		1		50e-6\n"
         "   STDPSynHandler		1		50e-6\n"
         "   GraupnerBrunel2012CaPlasticitySynHandler    1		50e-6\n"
+        "   SeqSynHandler		1		50e-6\n"
         "	CaConc				1		50e-6\n"
         "	CaConcBase			1		50e-6\n"
         "	DifShell			1		50e-6\n"
@@ -714,7 +715,7 @@ void Clock::handleStep( const Eref& e, unsigned long numSteps )
     assert( activeTicks_.size() == activeTicksMap_.size() );
     nSteps_ += numSteps;
     runTime_ = nSteps_ * dt_;
-    for ( isRunning_ = true;
+    for ( isRunning_ = (activeTicks_.size() > 0 );
             isRunning_ && currentStep_ < nSteps_; currentStep_ += stride_ )
     {
         // Curr time is end of current step.
@@ -752,11 +753,12 @@ void Clock::handleStep( const Eref& e, unsigned long numSteps )
             }
         }
     }
+	if ( activeTicks_.size() == 0 )
+		currentTime_ = runTime_;
 
     info_.dt = dt_;
     isRunning_ = false;
     finished()->send( e );
-
 }
 
 /**
@@ -837,6 +839,7 @@ void Clock::buildDefaultTick()
     defaultTick_["SimpleSynHandler"] = 1;
     defaultTick_["STDPSynHandler"] = 1;
     defaultTick_["GraupnerBrunel2012CaPlasticitySynHandler"] = 1;
+    defaultTick_["SeqSynHandler"] = 1;
     defaultTick_["CaConc"] = 1;
     defaultTick_["CaConcBase"] = 1;
     defaultTick_["DifShell"] = 1;
diff --git a/shell/Makefile b/shell/Makefile
index e39d221b..54330143 100644
--- a/shell/Makefile
+++ b/shell/Makefile
@@ -26,7 +26,8 @@ HEADERS = \
 default: $(TARGET)
 
 $(OBJ)	: $(HEADERS)
-Shell.o:	Shell.h Neutral.h ../scheduling/Clock.h ../sbml/MooseSbmlWriter.h ../sbml/MooseSbmlReader.h
+#Shell.o:	Shell.h Neutral.h ../scheduling/Clock.h ../sbml/MooseSbmlWriter.h ../sbml/MooseSbmlReader.h
+Shell.o:	Shell.h Neutral.h ../scheduling/Clock.h
 ShellCopy.o:	Shell.h Neutral.h ../scheduling/Clock.h
 ShellSetGet.o:	Shell.h
 ShellThreads.o:	Shell.h Neutral.h ../scheduling/Clock.h
diff --git a/shell/Shell.cpp b/shell/Shell.cpp
index 85aca3e4..8ba30eb5 100644
--- a/shell/Shell.cpp
+++ b/shell/Shell.cpp
@@ -28,11 +28,11 @@ using namespace std;
 // Want to separate out this search path into the Makefile options
 #include "../scheduling/Clock.h"
 
-#ifdef USE_SBML
+/*#ifdef USE_SBML
 #include "../sbml/MooseSbmlWriter.h"
 #include "../sbml/MooseSbmlReader.h"
 #endif
-
+*/
 const unsigned int Shell::OkStatus = ~0;
 const unsigned int Shell::ErrorStatus = ~1;
 
@@ -419,6 +419,7 @@ void Shell::doUseClock( string path, string field, unsigned int tick )
 /**
  * Write given model to SBML file. Returns success value.
  */
+ /*
 int Shell::doWriteSBML( const string& fname, const string& modelpath )
 {
 #ifdef USE_SBML
@@ -431,10 +432,11 @@ int Shell::doWriteSBML( const string& fname, const string& modelpath )
     return -2;
 #endif
 }
+*/
 /**
  * read given SBML model to moose. Returns success value.
  */
-
+/*
 Id Shell::doReadSBML( const string& fname, const string& modelpath, const string& solverclass )
 {
 #ifdef USE_SBML
@@ -445,7 +447,7 @@ Id Shell::doReadSBML( const string& fname, const string& modelpath, const string
     return Id();
 #endif
 }
-
+*/
 
 ////////////////////////////////////////////////////////////////////////
 
diff --git a/shell/Shell.h b/shell/Shell.h
index 4f7ae9f9..f3410f2f 100644
--- a/shell/Shell.h
+++ b/shell/Shell.h
@@ -246,9 +246,9 @@ class Shell
 		/**
 		 * Write given model to SBML file. Returns success value.
 		 */
-		 int doWriteSBML( const string& fname, const string& modelpath );
-		 Id doReadSBML( const string& fname, const string& modelpath, const string& solverclass=""
-		 );
+		 //int doWriteSBML( const string& fname, const string& modelpath );
+		 //Id doReadSBML( const string& fname, const string& modelpath, const string& solverclass=""
+		 //);
 		 
 		/**
  		 * This function synchronizes fieldDimension on the DataHandler 
diff --git a/synapse/CMakeLists.txt b/synapse/CMakeLists.txt
index 76d1e101..0a911670 100644
--- a/synapse/CMakeLists.txt
+++ b/synapse/CMakeLists.txt
@@ -1,11 +1,14 @@
 cmake_minimum_required(VERSION 2.6)
 include_directories(../basecode ../utility ../kinetics)
+include_directories( ../external/muparser/include/ )
 add_library(synapse
-    SimpleSynHandler.cpp
     SynHandlerBase.cpp
+    SimpleSynHandler.cpp
     STDPSynHandler.cpp
     GraupnerBrunel2012CaPlasticitySynHandler.cpp
     Synapse.cpp
     STDPSynapse.cpp
+    RollingMatrix.cpp 
+    SeqSynHandler.cpp
     testSynapse.cpp
     )
diff --git a/synapse/GraupnerBrunel2012CaPlasticitySynHandler.cpp b/synapse/GraupnerBrunel2012CaPlasticitySynHandler.cpp
index c4d3e2a2..29e340a5 100644
--- a/synapse/GraupnerBrunel2012CaPlasticitySynHandler.cpp
+++ b/synapse/GraupnerBrunel2012CaPlasticitySynHandler.cpp
@@ -10,8 +10,8 @@
 #include <queue>
 #include "header.h"
 #include "Synapse.h"
+#include "SynEvent.h" // only using the SynEvent class from this
 #include "SynHandlerBase.h"
-#include "SimpleSynHandler.h" // only using the SynEvent class from this
 #include "../randnum/Normal.h" // generate normal randum numbers for noisy weight update
 #include "GraupnerBrunel2012CaPlasticitySynHandler.h"
 
diff --git a/synapse/GraupnerBrunel2012CaPlasticitySynHandler.h b/synapse/GraupnerBrunel2012CaPlasticitySynHandler.h
index c0be336b..8ea9af06 100644
--- a/synapse/GraupnerBrunel2012CaPlasticitySynHandler.h
+++ b/synapse/GraupnerBrunel2012CaPlasticitySynHandler.h
@@ -10,6 +10,7 @@
 #ifndef _GRAUPNER_BRUNEL_2012_CA_PLASTICITY_SYN_HANDLER_H
 #define _GRAUPNER_BRUNEL_2012_CA_PLASTICITY_SYN_HANDLER_H
 
+/*
 class PreSynEvent: public SynEvent
 {
 	public:
@@ -50,6 +51,7 @@ struct ComparePostSynEvent
 		return lhs.time > rhs.time;
 	}
 };
+*/
 
 // see pg 13 of Higgins et al | October 2014 | Volume 10 | Issue 10 | e1003834 | PLOS Comp Biol
 // tP and tD are times spent above potentiation and depression thresholds
diff --git a/synapse/Makefile b/synapse/Makefile
index dc1860fe..bbf82b99 100644
--- a/synapse/Makefile
+++ b/synapse/Makefile
@@ -16,6 +16,8 @@ OBJ = \
 	GraupnerBrunel2012CaPlasticitySynHandler.o	\
 	Synapse.o	\
 	STDPSynapse.o	\
+	RollingMatrix.o	\
+	SeqSynHandler.o \
 	testSynapse.o	\
 
 # GSL_LIBS = -L/usr/lib -lgsl
@@ -29,15 +31,17 @@ default: $(TARGET)
 
 $(OBJ)	: $(HEADERS)
 SynHandlerBase.o:	SynHandlerBase.h Synapse.h 
-SimpleSynHandler.o:	SynHandlerBase.h Synapse.h SimpleSynHandler.h
-STDPSynHandler.o:	SynHandlerBase.h STDPSynapse.h STDPSynHandler.h
-GraupnerBrunel2012CaPlasticitySynHandler.o:	SynHandlerBase.h Synapse.h GraupnerBrunel2012CaPlasticitySynHandler.h
+SimpleSynHandler.o:	SynHandlerBase.h Synapse.h SimpleSynHandler.h SynEvent.h
+STDPSynHandler.o:	SynHandlerBase.h STDPSynapse.h STDPSynHandler.h SynEvent.h
+GraupnerBrunel2012CaPlasticitySynHandler.o:	SynHandlerBase.h Synapse.h GraupnerBrunel2012CaPlasticitySynHandler.h SynEvent.h
 Synapse.o:	Synapse.h SynHandlerBase.h
 STDPSynapse.o:	STDPSynapse.h SynHandlerBase.h
-testSynapse.o: SynHandlerBase.h Synapse.h SimpleSynHandler.h
+RollingMatrix.o:	RollingMatrix.h
+SeqSynHandler.o:	SynHandlerBase.h Synapse.h SimpleSynHandler.h SeqSynHandler.h RollingMatrix.h SynEvent.h
+testSynapse.o: SynHandlerBase.h Synapse.h SimpleSynHandler.h SeqSynHandler.h RollingMatrix.h SynEvent.h
 
 .cpp.o:
-	$(CXX) $(CXXFLAGS) $(SMOLDYN_FLAGS) -I. -I../basecode -I../msg $< -c
+	$(CXX) $(CXXFLAGS) $(SMOLDYN_FLAGS) -I. -I../basecode -I../msg -I .. -I../external/muparser/include $< -c
 
 $(TARGET): $(OBJ) $(SMOLDYN_OBJ) $(HEADERS)
 	$(LD) -r -o $(TARGET) $(OBJ) $(SMOLDYN_OBJ) $(SMOLDYN_LIB_PATH) $(SMOLDYN_LIBS) $(GSL_LIBS)
diff --git a/synapse/RollingMatrix.cpp b/synapse/RollingMatrix.cpp
new file mode 100644
index 00000000..5fd84a9b
--- /dev/null
+++ b/synapse/RollingMatrix.cpp
@@ -0,0 +1,109 @@
+/**********************************************************************
+** This program is part of 'MOOSE', the
+** Messaging Object Oriented Simulation Environment.
+**           Copyright (C) 2016 Upinder S. Bhalla. and NCBS
+** It is made available under the terms of the
+** GNU Lesser General Public License version 2.1
+** See the file COPYING.LIB for the full notice.
+**********************************************************************/
+
+#include <vector>
+using namespace std;
+#include "RollingMatrix.h"
+#include <cassert>
+
+RollingMatrix::RollingMatrix()
+		: nrows_(0), ncolumns_(0), currentStartRow_(0)
+{;}
+
+
+RollingMatrix::~RollingMatrix()
+{;}
+
+RollingMatrix& RollingMatrix::operator=( const RollingMatrix& other )
+{
+	nrows_ = other.nrows_;
+	ncolumns_ = other.ncolumns_;
+	currentStartRow_ = other.currentStartRow_;
+	rows_ = other.rows_;
+	return *this;
+}
+
+
+void RollingMatrix::resize( unsigned int nrows, unsigned int ncolumns )
+{
+	rows_.resize( nrows );
+	nrows_ = nrows;
+	ncolumns_ = ncolumns;
+	for ( unsigned int i = 0; i < nrows; ++i ) {
+		rows_[i].resize( ncolumns, 0.0 );
+	}
+	currentStartRow_ = 0;
+}
+
+double RollingMatrix::get( unsigned int row, unsigned int column ) const
+{
+	unsigned int index = (row + currentStartRow_ ) % nrows_;
+	return rows_[index][column];
+}
+
+void RollingMatrix::sumIntoEntry( double input, unsigned int row, unsigned int column )
+{
+	unsigned int index = (row + currentStartRow_ ) % nrows_;
+	SparseVector& sv = rows_[index];
+	sv[column] += input;
+}
+
+void RollingMatrix::sumIntoRow( const vector< double >& input, unsigned int row )
+{
+	unsigned int index = (row + currentStartRow_) % nrows_;
+	SparseVector& sv = rows_[index];
+
+	for (unsigned int i = 0; i < input.size(); ++i )
+		sv[i] += input[i];
+}
+
+
+double RollingMatrix::dotProduct( const vector< double >& input, 
+				unsigned int row, unsigned int startColumn ) const
+{
+	unsigned int index = (row + currentStartRow_) % nrows_;
+	const SparseVector& sv = rows_[index];
+
+	double ret = 0;
+	if ( input.size() + startColumn <= sv.size() ) {
+		for (unsigned int i = 0; i < input.size(); ++i )
+			ret += sv[i + startColumn] * input[i];
+	} else if ( sv.size() > startColumn ) {
+		unsigned int end = sv.size() - startColumn;
+		for (unsigned int i = 0; i < end; ++i )
+			ret += sv[i + startColumn] * input[i];
+	}
+	return ret;
+}
+
+void RollingMatrix::correl( vector< double >& ret, 
+				const vector< double >& input, unsigned int row) const
+
+{
+	if ( ret.size() < ncolumns_ )
+		ret.resize( ncolumns_, 0.0 );
+	for ( unsigned int i = 0; i < ncolumns_; ++i ) {
+		ret[i] += dotProduct( input, row, i );
+	}
+}
+
+void RollingMatrix::zeroOutRow( unsigned int row )
+{
+	unsigned int index = (row + currentStartRow_) % nrows_;
+	rows_[index].assign( rows_[index].size(), 0.0 );
+}
+
+void RollingMatrix::rollToNextRow()
+{
+	if ( currentStartRow_ == 0 )
+		currentStartRow_ = nrows_ - 1;
+	else 
+		currentStartRow_--;
+	zeroOutRow( 0 );
+}
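The rolling semantics in `RollingMatrix.cpp` above hinge on one index mapping: logical row `r` lives at physical row `(r + currentStartRow_) % nrows_`, and `rollToNextRow()` decrements `currentStartRow_` so yesterday's row 0 becomes today's row 1 while the oldest row is recycled and zeroed. A minimal standalone sketch of that arithmetic (the `MiniRolling` class is a hypothetical illustration, not the MOOSE class):

```cpp
#include <cassert>
#include <vector>

// Sketch of the ring-buffer indexing used by RollingMatrix: logical row r
// maps to physical row (r + start) % nrows; rolling moves start backwards
// so logical row 0 is always the newest row and the last row vanishes.
struct MiniRolling {
    unsigned nrows, ncols, start = 0;
    std::vector<std::vector<double>> rows;
    MiniRolling(unsigned r, unsigned c)
        : nrows(r), ncols(c), rows(r, std::vector<double>(c, 0.0)) {}
    double get(unsigned r, unsigned c) const {
        return rows[(r + start) % nrows][c];
    }
    void set(unsigned r, unsigned c, double v) {
        rows[(r + start) % nrows][c] = v;
    }
    void roll() {  // what was row 0 becomes row 1; new row 0 is zeroed
        start = (start == 0) ? nrows - 1 : start - 1;
        rows[start].assign(ncols, 0.0);
    }
};
```

After `nrows` rolls any entry written to row 0 has been recycled, which is exactly the fixed-duration history window `SeqSynHandler` relies on.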
diff --git a/synapse/RollingMatrix.h b/synapse/RollingMatrix.h
new file mode 100644
index 00000000..104636a8
--- /dev/null
+++ b/synapse/RollingMatrix.h
@@ -0,0 +1,61 @@
+/**********************************************************************
+** This program is part of 'MOOSE', the
+** Messaging Object Oriented Simulation Environment.
+**           Copyright (C) 2016 Upinder S. Bhalla. and NCBS
+** It is made available under the terms of the
+** GNU Lesser General Public License version 2.1
+** See the file COPYING.LIB for the full notice.
+**********************************************************************/
+
+#ifndef _ROLLING_MATRIX_H
+#define _ROLLING_MATRIX_H
+
+// Temporary, just to get going.
+typedef vector< double > SparseVector;
+
+class RollingMatrix {
+	public: 
+		// Specify empty matrix.
+		RollingMatrix();
+		~RollingMatrix();
+		RollingMatrix& operator=( const RollingMatrix& other );
+
+		// Specify size of matrix. Allocations may happen later.
+		void resize( unsigned int numRows, unsigned int numColumns );
+
+		// Return specified entry.
+		double get( unsigned int row, unsigned int column ) const;
+
+		// Sum contents of input into entry at specified row, column.
+		// Row index is relative to current zero.
+		void sumIntoEntry( double input, unsigned int row, unsigned int column );
+
+		// Sum contents of input into vector at specified row.
+		// Row index is relative to current zero.
+		void sumIntoRow( const vector< double >& input, unsigned int row );
+
+		// Return dot product of input with internal vector at specified 
+		// row, starting at specified column.
+		double dotProduct( const vector< double >& input, unsigned int row,
+					   	unsigned int startColumn ) const;
+
+		// Return correlation found by summing dotProduct across all columns
+		void correl( vector< double >& ret, const vector< double >& input, 
+						unsigned int row ) const;
+
+		// Zero out contents of row.
+		void zeroOutRow( unsigned int row );
+
+		// Roll the matrix by one row. What was row 0 becomes row 1, etc.
+		// Last row vanishes.
+		void rollToNextRow();
+
+	private:
+		unsigned int nrows_;
+		unsigned int ncolumns_;
+		unsigned int currentStartRow_;
+
+		vector< SparseVector > rows_;
+};
+
+#endif // _ROLLING_MATRIX_H
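The `dotProduct`/`correl` pair declared above implements a sliding dot product: for each start column `i`, the input (kernel row) is dotted against the stored row shifted by `i`, with the overlap truncated at the end of the row. Summed over all history rows, as `SeqSynHandler::vProcess` does, this gives the 2-D cross-correlation of kernel and history. A free-function sketch of the same per-row logic (simplified, not the MOOSE API):

```cpp
#include <cassert>
#include <vector>

// Dot product of input against row starting at column `start`,
// truncating the overlap at the end of the row.
double dotProduct(const std::vector<double>& input,
                  const std::vector<double>& row, unsigned start) {
    double ret = 0.0;
    for (unsigned i = 0; i + start < row.size() && i < input.size(); ++i)
        ret += row[i + start] * input[i];
    return ret;
}

// Sliding dot product across every start column: one row's worth of
// the correlation that RollingMatrix::correl accumulates.
std::vector<double> correl(const std::vector<double>& input,
                           const std::vector<double>& row) {
    std::vector<double> ret(row.size(), 0.0);
    for (unsigned i = 0; i < row.size(); ++i)
        ret[i] = dotProduct(input, row, i);
    return ret;
}
```

For example, correlating kernel row `{1, 1}` against stored row `{1, 2, 3}` yields `{3, 5, 3}`: full overlap at columns 0 and 1, truncated overlap at column 2.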
diff --git a/synapse/STDPSynHandler.cpp b/synapse/STDPSynHandler.cpp
index e3e20c17..568a89de 100644
--- a/synapse/STDPSynHandler.cpp
+++ b/synapse/STDPSynHandler.cpp
@@ -10,9 +10,9 @@
 #include <queue>
 #include "header.h"
 #include "Synapse.h"
+#include "SynEvent.h" // only using the SynEvent class from this
 #include "SynHandlerBase.h"
 #include "STDPSynapse.h"
-#include "SimpleSynHandler.h" // only using the SynEvent class from this
 #include "STDPSynHandler.h"
 
 const Cinfo* STDPSynHandler::initCinfo()
diff --git a/synapse/STDPSynHandler.h b/synapse/STDPSynHandler.h
index 704b0b9f..e5fc9f45 100644
--- a/synapse/STDPSynHandler.h
+++ b/synapse/STDPSynHandler.h
@@ -10,6 +10,7 @@
 #ifndef _STDP_SYN_HANDLER_H
 #define _STDP_SYN_HANDLER_H
 
+/*
 class PreSynEvent: public SynEvent
 {
 	public:
@@ -50,6 +51,7 @@ struct ComparePostSynEvent
 		return lhs.time > rhs.time;
 	}
 };
+*/
 
 /**
  * This handles simple synapses without plasticity. It uses a priority
diff --git a/synapse/SeqSynHandler.cpp b/synapse/SeqSynHandler.cpp
new file mode 100644
index 00000000..833b3402
--- /dev/null
+++ b/synapse/SeqSynHandler.cpp
@@ -0,0 +1,437 @@
+/**********************************************************************
+** This program is part of 'MOOSE', the
+** Messaging Object Oriented Simulation Environment.
+**           Copyright (C) 2016 Upinder S. Bhalla. and NCBS
+** It is made available under the terms of the
+** GNU Lesser General Public License version 2.1
+** See the file COPYING.LIB for the full notice.
+**********************************************************************/
+
+#include <queue>
+#include "header.h"
+#include "Synapse.h"
+#include "SynEvent.h"
+#include "SynHandlerBase.h"
+#include "RollingMatrix.h"
+#include "SeqSynHandler.h"
+#include "muParser.h"
+
+const Cinfo* SeqSynHandler::initCinfo()
+{
+	static string doc[] = 
+	{
+		"Name", "SeqSynHandler",
+		"Author", "Upi Bhalla",
+		"Description", 
+		"The SeqSynHandler handles synapses that recognize sequentially "
+			"ordered input, where the ordering is both in space and time. "
+			"It assumes that the N input synapses are ordered and "
+			"equally spaced along a single linear vector.\n "
+			"To do this it maintains a record of recent synaptic input, "
+			"for a duration of *historyTime*, at a time interval *seqDt*. "
+			"*SeqDt* is typically longer than the simulation "
+			"timestep *dt* for the synapse, and cannot be shorter. "
+			"*SeqDt* should represent the characteristic time of advance "
+			"of the sequence. \n"
+			"The SeqSynHandler uses a 2-D kernel to define how to recognize"
+			" a sequence, with dependence both on space and history. "
+			"This kernel is defined by the *kernelEquation* as a "
+			"mathematical expression in x (synapse number) and t (time). "
+			"It computes a vector with the local *response* term for each "
+			"point along all inputs, by taking a 2-d convolution of the "
+		    "kernel with the history[time][synapse#] matrix."
+			"\nThe local response can affect the synapse in three ways: " 
+			"1. It can sum the entire response vector, scale by the "
+			"*responseScale* term, and send to the synapse as a steady "
+			"activation. Consider this a cell-wide immediate response to "
+			"a sequence that it likes.\n"
+			"2. It can do an instantaneous scaling of the weight of each "
+			"individual synapse by the corresponding entry in the response "
+			"vector. It uses the *weightScale* term to do this. Consider "
+			"this a short-term plasticity effect on specific synapses. \n"
+			"3. It can do long-term plasticity of each individual synapse "
+			"using the matched local entries in the response vector and "
+			"individual synapse history as inputs to the learning rule. "
+			"This is not yet implemented.\n"
+	};
+
+	static FieldElementFinfo< SynHandlerBase, Synapse > synFinfo( 
+		"synapse",
+		"Sets up field Elements for synapse",
+		Synapse::initCinfo(),
+		&SynHandlerBase::getSynapse,
+		&SynHandlerBase::setNumSynapses,
+		&SynHandlerBase::getNumSynapses
+	);
+
+	static ValueFinfo< SeqSynHandler, string > kernelEquation(
+			"kernelEquation",
+			"Equation in x and t to define kernel for sequence recognition",
+			&SeqSynHandler::setKernelEquation,
+			&SeqSynHandler::getKernelEquation
+	);
+	static ValueFinfo< SeqSynHandler, unsigned int > kernelWidth(
+			"kernelWidth",
+			"Width of kernel, i.e., number of synapses taking part in seq.",
+			&SeqSynHandler::setKernelWidth,
+			&SeqSynHandler::getKernelWidth
+	);
+	static ValueFinfo< SeqSynHandler, double > seqDt(
+			"seqDt",
+			"Characteristic time for advancing the sequence.",
+			&SeqSynHandler::setSeqDt,
+			&SeqSynHandler::getSeqDt
+	);
+	static ValueFinfo< SeqSynHandler, double > historyTime(
+			"historyTime",
+			"Duration to keep track of history of inputs to all synapses.",
+			&SeqSynHandler::setHistoryTime,
+			&SeqSynHandler::getHistoryTime
+	);
+	static ValueFinfo< SeqSynHandler, double > responseScale(
+			"responseScale",
+			"Scaling factor for sustained activation of synapse by seq",
+			&SeqSynHandler::setResponseScale,
+			&SeqSynHandler::getResponseScale
+	);
+	static ReadOnlyValueFinfo< SeqSynHandler, double > seqActivation(
+			"seqActivation",
+			"Reports summed activation of synaptic channel by sequence",
+			&SeqSynHandler::getSeqActivation
+	);
+	static ValueFinfo< SeqSynHandler, double > weightScale(
+			"weightScale",
+			"Scaling factor for weight of each synapse by response vector",
+			&SeqSynHandler::setWeightScale,
+			&SeqSynHandler::getWeightScale
+	);
+	static ReadOnlyValueFinfo< SeqSynHandler, vector< double > > 
+			weightScaleVec(
+			"weightScaleVec",
+			"Vector of weight scaling for each synapse",
+			&SeqSynHandler::getWeightScaleVec
+	);
+	static ReadOnlyValueFinfo< SeqSynHandler, vector< double > > kernel(
+			"kernel",
+			"All entries of kernel, as a linear vector",
+			&SeqSynHandler::getKernel
+	);
+	static ReadOnlyValueFinfo< SeqSynHandler, vector< double > > history(
+			"history",
+			"All entries of history, as a linear vector",
+			&SeqSynHandler::getHistory
+	);
+
+	static Finfo* seqSynHandlerFinfos[] = {
+		&synFinfo,					// FieldElement
+		&kernelEquation,			// Field
+		&kernelWidth,				// Field
+		&seqDt,						// Field
+		&historyTime,				// Field
+		&responseScale,				// Field
+		&seqActivation,				// Field
+		&weightScale,				// Field
+		&weightScaleVec,			// Field
+		&kernel,					// ReadOnlyField
+		&history					// ReadOnlyField
+	};
+
+	static Dinfo< SeqSynHandler > dinfo;
+	static Cinfo seqSynHandlerCinfo (
+		"SeqSynHandler",
+		SynHandlerBase::initCinfo(),
+		seqSynHandlerFinfos,
+		sizeof( seqSynHandlerFinfos ) / sizeof ( Finfo* ),
+		&dinfo,
+		doc,
+		sizeof( doc ) / sizeof( string )
+	);
+
+	return &seqSynHandlerCinfo;
+}
+
+static const Cinfo* seqSynHandlerCinfo = SeqSynHandler::initCinfo();
+
+//////////////////////////////////////////////////////////////////////
+
+SeqSynHandler::SeqSynHandler()
+	: 
+		kernelEquation_( "" ),
+		kernelWidth_( 5 ),
+		historyTime_( 2.0 ), 
+		seqDt_ ( 1.0 ), 
+		responseScale_( 1.0 ),
+		weightScale_( 0.0 ),
+		seqActivation_( 0.0 )
+{ 
+	int numHistory = static_cast< int >( 1.0 + floor( historyTime_ * (1.0 - 1e-6 ) / seqDt_ ) );
+	history_.resize( numHistory, 0 );
+}
+
+SeqSynHandler::~SeqSynHandler()
+{ ; }
+
+//////////////////////////////////////////////////////////////////////
+SeqSynHandler& SeqSynHandler::operator=( const SeqSynHandler& ssh)
+{
+	synapses_ = ssh.synapses_;
+	for ( vector< Synapse >::iterator 
+					i = synapses_.begin(); i != synapses_.end(); ++i )
+			i->setHandler( this );
+
+	// For no apparent reason, priority queues don't have a clear operation.
+	while( !events_.empty() )
+		events_.pop();
+
+	return *this;
+}
+
+void SeqSynHandler::vSetNumSynapses( const unsigned int v )
+{
+	unsigned int prevSize = synapses_.size();
+	synapses_.resize( v );
+	for ( unsigned int i = prevSize; i < v; ++i )
+		synapses_[i].setHandler( this );
+
+	int numHistory = static_cast< int >( 1.0 + floor( historyTime_ * (1.0 - 1e-6 ) / seqDt_ ) );
+	history_.resize( numHistory, v );
+	latestSpikes_.resize( v, 0.0 );
+	weightScaleVec_.resize( v, 0.0 );
+	updateKernel();
+}
+
+unsigned int SeqSynHandler::vGetNumSynapses() const
+{
+	return synapses_.size();
+}
+
+Synapse* SeqSynHandler::vGetSynapse( unsigned int i )
+{
+	static Synapse dummy;
+	if ( i < synapses_.size() )
+		return &synapses_[i];
+	cout << "Warning: SeqSynHandler::getSynapse: index: " << i <<
+		" is out of range: " << synapses_.size() << endl;
+	return &dummy;
+}
+
+//////////////////////////////////////////////////////////////////////
+void SeqSynHandler::updateKernel()
+{
+	if ( kernelEquation_ == "" || seqDt_ < 1e-9 || historyTime_ < 1e-9 )
+		return;
+	double x = 0;
+	double t = 0;
+	mu::Parser p;
+	p.DefineVar("x", &x); 
+	p.DefineVar("t", &t); 
+	p.DefineConst(_T("pi"), (mu::value_type)M_PI);
+	p.DefineConst(_T("e"), (mu::value_type)M_E);
+	p.SetExpr( kernelEquation_ );
+	kernel_.clear();
+	int numHistory = static_cast< int >( 1.0 + floor( historyTime_ * (1.0 - 1e-6 ) / seqDt_ ) );
+	kernel_.resize( numHistory );
+	for ( int i = 0; i < numHistory; ++i ) {
+		kernel_[i].resize( kernelWidth_ );
+		t = i * seqDt_;
+		for ( unsigned int j = 0; j < kernelWidth_; ++j ) {
+			x = j;
+			kernel_[i][j] = p.Eval();
+		}
+	}
+}
+
+//////////////////////////////////////////////////////////////////////
+void SeqSynHandler::setKernelEquation( string eq )
+{
+	kernelEquation_ = eq;
+	updateKernel();
+}
+
+string SeqSynHandler::getKernelEquation() const
+{
+	return kernelEquation_;
+}
+
+void SeqSynHandler::setKernelWidth( unsigned int v )
+{
+	kernelWidth_ = v;
+	updateKernel();
+}
+
+unsigned int SeqSynHandler::getKernelWidth() const
+{
+	return kernelWidth_;
+}
+
+void SeqSynHandler::setSeqDt( double v )
+{
+	seqDt_ = v;
+	updateKernel();
+	int numHistory = static_cast< int >( 1.0 + floor( historyTime_ * (1.0 - 1e-6 ) / seqDt_ ) );
+	history_.resize( numHistory, vGetNumSynapses() );
+}
+
+double SeqSynHandler::getSeqDt() const
+{
+	return seqDt_;
+}
+
+void SeqSynHandler::setHistoryTime( double v )
+{
+	historyTime_ = v;
+	int numHistory = static_cast< int >( 1.0 + floor( historyTime_ * (1.0 - 1e-6 ) / seqDt_ ) );
+	history_.resize( numHistory, vGetNumSynapses() );
+	updateKernel();
+}
+
+double SeqSynHandler::getHistoryTime() const
+{
+	return historyTime_;
+}
+
+void SeqSynHandler::setResponseScale( double v )
+{
+	responseScale_ = v;
+}
+
+double SeqSynHandler::getResponseScale() const
+{
+	return responseScale_;
+}
+
+double SeqSynHandler::getSeqActivation() const
+{
+	return seqActivation_;
+}
+
+double SeqSynHandler::getWeightScale() const
+{
+	return weightScale_;
+}
+
+vector< double > SeqSynHandler::getWeightScaleVec() const
+{
+	return weightScaleVec_;
+}
+
+void SeqSynHandler::setWeightScale( double v )
+{
+	weightScale_ = v;
+}
+
+vector< double > SeqSynHandler::getKernel() const
+{
+	int numHistory = static_cast< int >( 1.0 + floor( historyTime_ * (1.0 - 1e-6 ) / seqDt_ ) );
+	vector< double > ret;
+	for ( int i = 0; i < numHistory; ++i ) {
+		ret.insert( ret.end(), kernel_[i].begin(), kernel_[i].end() );
+	}
+	return ret;
+}
+
+vector< double > SeqSynHandler::getHistory() const
+{
+	int numHistory = static_cast< int >( 1.0 + floor( historyTime_ * (1.0 - 1e-6 ) / seqDt_ ) );
+	int numX = vGetNumSynapses();
+	vector< double > ret( numX * numHistory, 0.0 );
+	vector< double >::iterator k = ret.begin();
+	for ( int i = 0; i < numHistory; ++i ) {
+		for ( int j = 0; j < numX; ++j )
+			*k++ = history_.get( i, j );
+	}
+	return ret;
+}
+
+/////////////////////////////////////////////////////////////////////
+
+void SeqSynHandler::addSpike(unsigned int index, double time, double weight)
+{
+	assert( index < synapses_.size() );
+	events_.push( PreSynEvent( index, time, weight ) );
+	// Strictly speaking this isn't right. If we have a long time lag
+	// then the correct placement of the spike may be in another time
+	// slice. For now, to get it going for LIF neurons, this will do.
+	// Even in the general case we will probably have a very wide window
+	// for the latestSpikes slice.
+	latestSpikes_[index] += weight;
+}
+
+unsigned int SeqSynHandler::addSynapse()
+{
+	unsigned int newSynIndex = synapses_.size();
+	synapses_.resize( newSynIndex + 1 );
+	synapses_[newSynIndex].setHandler( this );
+	return newSynIndex;
+}
+
+void SeqSynHandler::dropSynapse( unsigned int msgLookup )
+{
+	assert( msgLookup < synapses_.size() );
+	synapses_[msgLookup].setWeight( -1.0 );
+}
+
+/////////////////////////////////////////////////////////////////////
+void SeqSynHandler::vProcess( const Eref& e, ProcPtr p ) 
+{
+	// Here we look at the correlations and do something with them.
+	int numHistory = static_cast< int >( 1.0 + floor( historyTime_ * (1.0 - 1e-6 ) / seqDt_ ) );
+
+	// Check if we need to do correlations at all.
+	if ( numHistory > 0 && kernel_.size() > 0 ) {
+		// Check if timestep rolls over a seqDt boundary
+		if ( static_cast< int >( p->currTime / seqDt_ ) > 
+				static_cast< int >( (p->currTime - p->dt) / seqDt_ ) ) {
+			history_.rollToNextRow();
+			history_.sumIntoRow( latestSpikes_, 0 );
+			latestSpikes_.assign( vGetNumSynapses(), 0.0 );
+	
+			// Build up the sum of correlations over time
+			vector< double > correlVec( vGetNumSynapses(), 0.0 );
+			for ( int i = 0; i < numHistory; ++i )
+				history_.correl( correlVec, kernel_[i], i );
+			if ( responseScale_ > 0.0 ) { // Sum all responses, send to chan
+				seqActivation_ = 0.0;
+				for ( vector< double >::iterator y = correlVec.begin(); 
+								y != correlVec.end(); ++y )
+					seqActivation_ += *y;
+	
+				// We'll use the seqActivation_ to send a special msg.
+				seqActivation_ *= responseScale_;
+			}
+			if ( weightScale_ > 0.0 ) { // Short term changes in individual wts
+				weightScaleVec_ = correlVec;
+				for ( vector< double >::iterator y=weightScaleVec_.begin(); 
+							y != weightScaleVec_.end(); ++y )
+					*y *= weightScale_;
+			}
+		}
+	}
+
+	// Here we go through the regular synapse activation calculations.
+	// We can't leave it to the base class vProcess, because we need
+	// to scale the weights individually in some cases.
+	double activation = seqActivation_; // Start with seq activation
+	if ( weightScale_ > 0.0 ) {
+		while( !events_.empty() && events_.top().time <= p->currTime ) {
+			activation += events_.top().weight * 
+					weightScaleVec_[ events_.top().synIndex ] / p->dt;
+			events_.pop();
+		}
+	} else {
+		while( !events_.empty() && events_.top().time <= p->currTime ) {
+			activation += events_.top().weight / p->dt;
+			events_.pop();
+		}
+	}
+	if ( activation != 0.0 )
+		SynHandlerBase::activationOut()->send( e, activation );
+}
+
+void SeqSynHandler::vReinit( const Eref& e, ProcPtr p ) 
+{
+	// std::priority_queue has no clear() operation, so drain it manually.
+	while( !events_.empty() )
+		events_.pop();
+}
+
diff --git a/synapse/SeqSynHandler.h b/synapse/SeqSynHandler.h
new file mode 100644
index 00000000..55fd0118
--- /dev/null
+++ b/synapse/SeqSynHandler.h
@@ -0,0 +1,117 @@
+/**********************************************************************
+** This program is part of 'MOOSE', the
+** Messaging Object Oriented Simulation Environment.
+**           Copyright (C) 2016 Upinder S. Bhalla. and NCBS
+** It is made available under the terms of the
+** GNU Lesser General Public License version 2.1
+** See the file COPYING.LIB for the full notice.
+**********************************************************************/
+
+#ifndef _SEQ_SYN_HANDLER_H
+#define _SEQ_SYN_HANDLER_H
+
+/**
+ * This handles synapses organized sequentially. The parent class
+ * SimpleSynHandler deals with the mechanics of data arrival.
+ * Here the algorithm is
+ * 0. Assume all synaptic input comes on a linear dendrite.
+ * 1. Maintain a history of depth D for synaptic activity. May be simplest
+ * to do as event list (sparse matrix) rather than full matrix.
+ * 2. Maintain a kernel of how to weight time and space
+ * 3. Here we have various options
+ * 3.1 At each spike event, decide how to weight it based on history.
+ * 3.2 At each timestep, compute an effective Gk and send to activation.
+ * 	This will be moderately nasty to compute
+ */
+class SeqSynHandler: public SynHandlerBase
+{
+	public: 
+		SeqSynHandler();
+		~SeqSynHandler();
+		SeqSynHandler& operator=( const SeqSynHandler& other );
+
+		////////////////////////////////////////////////////////////////
+		// Over-ridden virtual functions
+		////////////////////////////////////////////////////////////////
+		void vSetNumSynapses( unsigned int num );
+		unsigned int vGetNumSynapses() const;
+		Synapse* vGetSynapse( unsigned int i );
+		void vProcess( const Eref& e, ProcPtr p );
+		void vReinit( const Eref& e, ProcPtr p );
+
+		////////////////////////////////////////////////////////////////
+		/// Adds a new synapse, returns its index.
+		unsigned int addSynapse();
+		void dropSynapse( unsigned int droppedSynNumber );
+		void addSpike( unsigned int index, double time, double weight );
+		////////////////////////////////////////////////////////////////
+		// New fields.
+		////////////////////////////////////////////////////////////////
+		void setKernelEquation( string eq );
+ 		string getKernelEquation() const;
+		void setKernelWidth( unsigned int v );
+ 		unsigned int getKernelWidth() const;
+		void setSeqDt( double v );
+ 		double getSeqDt() const;
+		void setHistoryTime( double v );
+ 		double getHistoryTime() const;
+		void setResponseScale( double v );
+ 		double getResponseScale() const;
+ 		double getSeqActivation() const; // summed activation of syn chan
+		void setWeightScale( double v );
+ 		double getWeightScale() const;
+ 		vector< double > getWeightScaleVec() const;
+ 		vector< double > getKernel() const;
+ 		vector< double > getHistory() const;
+
+		////////////////////////////////////////////////////////////////
+		static const Cinfo* initCinfo();
+	private:
+		void updateKernel();
+		/*
+		 * Here I would like to put in a sparse matrix. 
+		 * Each timestep is a row
+		 * Each column is a neuron
+		 * Each value is the weight, though I could also look this up.
+		 * I need to make it a circular buffer.
+		 * The 'addRow' function inserts the non-zero entries representing
+		 * 	neurons that are active on this timestep. As a circular buffer
+		 * 	this needs to do some allocation juggling.
+		 * Need a new function similar to computeRowRate, basically a
+		 * dot product of input vector with specified row.
+		 * Then run through all available places.
+		 */
+		string kernelEquation_;
+		unsigned int kernelWidth_; // Width in terms of number of synapses 
+
+		// Time to store history. KernelDt defines num of rows
+		double historyTime_;	
+		double seqDt_;	// Time step for successive entries in kernel
+		// Scaling factor for sustained activation of synapse from response
+		double responseScale_; 
+		// Scaling factor for weight changes in each synapse from response
+		double weightScale_;
+
+		///////////////////////////////////////////
+		// Some readonly fields
+		double seqActivation_; // global activation if sequence recognized
+
+		// Weight scaling based on individual synapse sequence tuning.
+		vector< double > weightScaleVec_; 
+		
+		///////////////////////////////////////////
+		// Tracks the spikes that came in recently, as input to correlation
+		// analysis for sequence recognition.
+		vector< double > latestSpikes_; 
+
+		///////////////////////////////////////////
+		vector< vector<  double > > kernel_;	//Kernel for seq selectivity
+		RollingMatrix history_;	// Rows = time; cols = synInputs
+
+		vector< Synapse > synapses_;
+		priority_queue< PreSynEvent, vector< PreSynEvent >, CompareSynEvent > events_;
+
+
+};
+
+#endif // _SEQ_SYN_HANDLER_H
diff --git a/synapse/SimpleSynHandler.cpp b/synapse/SimpleSynHandler.cpp
index 5581b9da..360769e2 100644
--- a/synapse/SimpleSynHandler.cpp
+++ b/synapse/SimpleSynHandler.cpp
@@ -10,6 +10,7 @@
 #include <queue>
 #include "header.h"
 #include "Synapse.h"
+#include "SynEvent.h"
 #include "SynHandlerBase.h"
 #include "SimpleSynHandler.h"
 
diff --git a/synapse/SimpleSynHandler.h b/synapse/SimpleSynHandler.h
index 14c94262..4029332b 100644
--- a/synapse/SimpleSynHandler.h
+++ b/synapse/SimpleSynHandler.h
@@ -10,6 +10,7 @@
 #ifndef _SIMPLE_SYN_HANDLER_H
 #define _SIMPLE_SYN_HANDLER_H
 
+/*
 class SynEvent
 {
 	public:
@@ -34,6 +35,7 @@ struct CompareSynEvent
 		return lhs.time > rhs.time;
 	}
 };
+*/
 
 /**
  * This handles simple synapses without plasticity. It uses a priority
diff --git a/synapse/SynEvent.h b/synapse/SynEvent.h
new file mode 100644
index 00000000..20e517f4
--- /dev/null
+++ b/synapse/SynEvent.h
@@ -0,0 +1,79 @@
+/**********************************************************************
+** This program is part of 'MOOSE', the
+** Messaging Object Oriented Simulation Environment.
+**           Copyright (C) 2013 Upinder S. Bhalla. and NCBS
+** It is made available under the terms of the
+** GNU Lesser General Public License version 2.1
+** See the file COPYING.LIB for the full notice.
+**********************************************************************/
+
+#ifndef _SYN_EVENT_H
+#define _SYN_EVENT_H
+
+class SynEvent
+{
+	public:
+		SynEvent()
+			: time( 0.0 ), weight( 0.0 )
+		{;}
+
+		SynEvent( double t, double w )
+			: time( t ), weight( w )
+		{;}
+
+		double time;
+		double weight;
+};
+
+struct CompareSynEvent
+{
+	bool operator()(const SynEvent& lhs, const SynEvent& rhs) const
+	{
+		// Note that this is backwards. We want the smallest timestamp
+		// on the top of the events priority_queue.
+		return lhs.time > rhs.time;
+	}
+};
+
+class PreSynEvent: public SynEvent
+{
+	public:
+		PreSynEvent()
+			: SynEvent(),   // explicit call to the parent default constructor;
+                            // it would run anyway, so this is just for clarity
+              synIndex( 0 )
+		{}
+
+		PreSynEvent( unsigned int i, double t, double w )
+			: SynEvent(t,w),// call the parent constructor with given args
+              synIndex( i )
+		{;}
+
+        unsigned int synIndex;
+};
+
+class PostSynEvent
+{
+	public:
+		PostSynEvent()
+			: time( 0.0 )
+		{;}
+
+		PostSynEvent( double t )
+			: time( t )
+		{;}
+
+		double time;
+};
+
+struct ComparePostSynEvent
+{
+	bool operator()(const PostSynEvent& lhs, const PostSynEvent& rhs) const
+	{
+		// Note that this is backwards. We want the smallest timestamp
+		// on the top of the events priority_queue.
+		return lhs.time > rhs.time;
+	}
+};
+
+#endif // _SYN_EVENT_H
diff --git a/synapse/testSynapse.cpp b/synapse/testSynapse.cpp
index b46e2f99..2edd4165 100644
--- a/synapse/testSynapse.cpp
+++ b/synapse/testSynapse.cpp
@@ -12,15 +12,146 @@
 #include <queue>
 #include "header.h"
 #include "Synapse.h"
+#include "SynEvent.h"
 #include "SynHandlerBase.h"
 #include "SimpleSynHandler.h"
+#include "RollingMatrix.h"
+#include "SeqSynHandler.h"
 #include "../shell/Shell.h"
 #include "../randnum/randnum.h"
 
+void testRollingMatrix()
+{
+	int nr = 5;
+	int ncol = 10;
+	RollingMatrix rm;
+	rm.resize( 5, 10 );
+	
+	for ( int i = 0; i < nr; ++i ) {
+		rm.sumIntoEntry( i + 1, i, i );
+	}
+	for ( int i = 0; i < nr; ++i ) {
+		for ( int j = 0; j < ncol; ++j ) {
+			assert( rm.get( i, j ) == ( i == j ) * (i+1) );
+		}
+	}
+	cout << "." << flush;
+
+	// Old row0 becomes row1 and so on. Old row4 (now 0) should be cleared.
+	rm.rollToNextRow(); 
+	for ( int i = 0; i < nr; ++i ) {
+		for ( int j = 0; j < ncol; ++j ) {
+			// cout << rm.get( i, j );
+			assert( rm.get( i, j ) == ( i == j+1 ) * i );
+		}
+		// Here are the entries in the rm.rows_ matrix (10 columns):
+		// 0000000000
+		// 1000000000
+		// 0200000000
+		// 0030000000
+		// 0004000000
+	}
+	cout << "." << flush;
+
+	vector< double > input( 10, 0.0 );
+	for ( int i = 0; i < nr; ++i )
+			input[i] = i + 1;
+
+	assert( doubleEq( rm.dotProduct( input, 0, 0 ), 0.0 ) );
+	assert( doubleEq( rm.dotProduct( input, 1, 0 ), 1.0 ) );
+	assert( doubleEq( rm.dotProduct( input, 2, 0 ), 4.0 ) );
+	assert( doubleEq( rm.dotProduct( input, 3, 0 ), 9.0 ) );
+	assert( doubleEq( rm.dotProduct( input, 4, 0 ), 16.0 ) );
+	assert( doubleEq( rm.dotProduct( input, 4, 1 ), 12.0 ) );
+	assert( doubleEq( rm.dotProduct( input, 4, 2 ), 8.0 ) );
+	assert( doubleEq( rm.dotProduct( input, 4, 3 ), 4.0 ) );
+	assert( doubleEq( rm.dotProduct( input, 4, 4 ), 0.0 ) );
+
+	rm.sumIntoRow( input, 0 );	// input == [1234500000]
+	vector< double > corr;
+	rm.correl( corr, input, 4 );	// rm[4] == [0004000000]
+	assert( doubleEq( corr[0], 16.0 ) );
+	assert( doubleEq( corr[1], 12.0 ) );
+	assert( doubleEq( corr[2], 8.0 ) );
+	assert( doubleEq( corr[3], 4.0 ) );
+	assert( doubleEq( corr[4], 0.0 ) );
+
+	corr.assign( corr.size(), 0 );
+	rm.correl( corr, input, 0 );	// rm[0] == [1234500000]
+	assert( doubleEq( corr[0], 55.0 ) );
+	assert( doubleEq( corr[1], 40.0 ) );
+	assert( doubleEq( corr[2], 26.0 ) );
+	assert( doubleEq( corr[3], 14.0 ) );
+	assert( doubleEq( corr[4], 5.0 ) );
+
+	cout << "." << flush;
+}
+
+void testSeqSynapse()
+{
+	int numSyn = 10;
+	int kernelWidth = 5;
+	SeqSynHandler ssh;
+	ssh.vSetNumSynapses( numSyn );
+	// for ( int i = 0; i < numSyn; ++i )
+		// ssh.addSynapse();
+	assert( static_cast< int >( ssh.vGetNumSynapses() ) == numSyn );
+	ssh.setSeqDt( 1.0 );
+	ssh.setHistoryTime( 5.0 );
+	ssh.setKernelWidth( kernelWidth );
+	ssh.setKernelEquation( "(x == t)*5 + ((x+1)==t || (x-1)==t) * 2 - 1" );
+
+	vector< double > ret = ssh.getKernel();
+	assert( ret.size() == static_cast< unsigned int > (5 * kernelWidth ) );
+	vector< double >::iterator k = ret.begin();
+	for ( int t = 0; t < 5; ++t ) {
+		for ( int x = 0; x < kernelWidth; ++x ) {
+			double val = (x == t)*5 + ((x+1)==t || (x-1)==t) * 2 - 1;
+			assert( doubleEq( *k++, val ) );
+		}
+	}
+
+	cout << "." << flush;
+
+	ssh.setResponseScale( 1.0 );
+	for ( int i = 0; i < numSyn; ++i ) {
+		ssh.addSpike( i, 0.0, 1.0 );
+	}
+	ssh.setWeightScale( 1.0 );
+	ProcInfo p;
+
+	Eref sheller( Id().eref() );
+	Shell* shell = reinterpret_cast< Shell* >( sheller.data() );
+	Id sid = shell->doCreate( "SeqSynHandler", Id(), "sid", 1 );
+	assert( sid.element()->getName() == "sid" );
+	ssh.vProcess( sid.eref(), &p );
+
+	// Here we correlate the vector [1,1,1,1,1,1,1,1,1,1] with
+	// the kernel [4,1,-1,-1,-1]
+	// Other lines are zeros.
+	// Should really make the kernel mapping symmetrical.
+	assert( doubleEq( ssh.getSeqActivation(), 28.0 ) );
+	vector< double > wts = ssh.getWeightScaleVec();
+	for ( int i = 0; i < numSyn-4; ++i )
+		assert( doubleEq( wts[i], 2.0 ) );
+	assert( doubleEq( wts[6], 3 ) ); // Edge effects. Last -1 vanishes.
+	assert( doubleEq( wts[7], 4 ) ); // Edge effects. 
+	assert( doubleEq( wts[8], 5 ) ); // Edge effects.
+	assert( doubleEq( wts[9], 4 ) ); // Edge effects.
+		
+	cout << "." << flush;
+	shell->doDelete( sid );
+}
+
+#endif // DO_UNIT_TESTS
 
 // This tests stuff without using the messaging.
 void testSynapse()
 {
+#ifdef DO_UNIT_TESTS
+	testRollingMatrix();
+	testSeqSynapse();
+#endif // DO_UNIT_TESTS
 }
 
 // This is applicable to tests that use the messaging and scheduling.
@@ -28,4 +159,3 @@ void testSynapseProcess()
 {
 }
 
-#endif // DO_UNIT_TESTS
diff --git a/tests/python/chem_models/acc27.g b/tests/python/chem_models/acc27.g
new file mode 100644
index 00000000..04f97d20
--- /dev/null
+++ b/tests/python/chem_models/acc27.g
@@ -0,0 +1,347 @@
+//  DOQCS : http://doqcs.ncbs.res.in/ 
+//  Accession Name = MAPK_osc 
+//  Accession Number = 27 
+//  Transcriber = Sridhar Hariharaputran, NCBS 
+//  Developer = Boris N. Kholodenko 
+//  Species = Xenopus 
+//  Tissue = Oocyte extract 
+//  Cell Compartment = Cytosol 
+//  Notes = This MAPK model is based on <a href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=10712587">Boris N. Kholodenko Eur J Biochem. (2000) 267(6):1583-8</a> for data from Xenopus oocytes extracts. 
+ 
+ //genesis
+// kkit Version 11 flat dumpfile
+ 
+// Saved on Thu Dec  8 10:58:02 2005
+ 
+include kkit {argv 1}
+ 
+FASTDT = 5e-05
+SIMDT = 0.005
+CONTROLDT = 10
+PLOTDT = 5
+MAXTIME = 6000
+TRANSIENT_TIME = 2
+VARIABLE_DT_FLAG = 1
+DEFAULT_VOL = 1.6667e-21
+VERSION = 11.0
+setfield /file/modpath value /home2/bhalla/scripts/modules
+kparms
+ 
+//genesis
+
+initdump -version 3 -ignoreorphans 1
+simobjdump doqcsinfo filename accessname accesstype transcriber developer \
+  citation species tissue cellcompartment methodology sources \
+  model_implementation model_validation x y z
+simobjdump table input output alloced step_mode stepsize x y z
+simobjdump xtree path script namemode sizescale
+simobjdump xcoredraw xmin xmax ymin ymax
+simobjdump xtext editable
+simobjdump xgraph xmin xmax ymin ymax overlay
+simobjdump xplot pixflags script fg ysquish do_slope wy
+simobjdump group xtree_fg_req xtree_textfg_req plotfield expanded movealone \
+  link savename file version md5sum mod_save_flag x y z
+simobjdump geometry size dim shape outside xtree_fg_req xtree_textfg_req x y \
+  z
+simobjdump kpool DiffConst CoInit Co n nInit mwt nMin vol slave_enable \
+  geomname xtree_fg_req xtree_textfg_req x y z
+simobjdump kreac kf kb notes xtree_fg_req xtree_textfg_req x y z
+simobjdump kenz CoComplexInit CoComplex nComplexInit nComplex vol k1 k2 k3 \
+  keepconc usecomplex notes xtree_fg_req xtree_textfg_req link x y z
+simobjdump stim level1 width1 delay1 level2 width2 delay2 baselevel trig_time \
+  trig_mode notes xtree_fg_req xtree_textfg_req is_running x y z
+simobjdump xtab input output alloced step_mode stepsize notes editfunc \
+  xtree_fg_req xtree_textfg_req baselevel last_x last_y is_running x y z
+simobjdump kchan perm gmax Vm is_active use_nernst notes xtree_fg_req \
+  xtree_textfg_req x y z
+simobjdump transport input output alloced step_mode stepsize dt delay clock \
+  kf xtree_fg_req xtree_textfg_req x y z
+simobjdump proto x y z
+simundump geometry /kinetics/geometry 0 1.6667e-21 3 sphere "" white black 10 \
+  9 0
+simundump group /kinetics/MAPK 0 yellow black x 0 0 "" MAPK \
+  /home2/bhalla/scripts/modules/MAPK_0.g 0 0 0 1 10 0
+simundump kpool /kinetics/MAPK/MAPK 0 0 0.3 0.3 0.3 0.3 0 0 1 0 \
+  /kinetics/geometry 35 yellow -8 -7 0
+simundump kpool /kinetics/MAPK/MKKK 0 0 0.1 0.1 0.1 0.1 0 0 1 0 \
+  /kinetics/geometry 16 yellow -8 5 0
+simundump kpool /kinetics/MAPK/MKK 0 0 0.3 0.3 0.3 0.3 0 0 1 0 \
+  /kinetics/geometry 60 yellow -8 -1 0
+simundump kpool /kinetics/MAPK/int1 0 0 0.001 0.001 0.001 0.001 0 0 1 0 \
+  /kinetics/geometry 30 yellow -4 4 0
+simundump kenz /kinetics/MAPK/int1/2 0 0 0 0 0 0.001 156.25 1 0.25 0 1 "" red \
+  30 "" -4 5 0
+simundump kpool /kinetics/MAPK/MKKK-P 0 0 0 0 0 0 0 0 1 0 /kinetics/geometry \
+  51 yellow 0 5 0
+simundump kenz /kinetics/MAPK/MKKK-P/3 0 0 0 0 0 0.001 8.3333 0.1 0.025 0 1 \
+  "" red 51 "" -4 2 0
+simundump kenz /kinetics/MAPK/MKKK-P/4 0 0 0 0 0 0.001 8.3333 0.1 0.025 0 1 \
+  "" red 51 "" 4 2 0
+simundump kpool /kinetics/MAPK/int3 0 0 0.001 0.001 0.001 0.001 0 0 1 0 \
+  /kinetics/geometry blue yellow -4 -2 0
+simundump kenz /kinetics/MAPK/int3/6 0 0 0 0 0 0.001 250 3 0.75 0 1 "" red \
+  blue "" -4 -1 0
+simundump kpool /kinetics/MAPK/int5 0 0 0.001 0.001 0.001 0.001 0 0 1 0 \
+  /kinetics/geometry 1 yellow -4 -8 0
+simundump kenz /kinetics/MAPK/int5/10 0 0 0 0 0 0.001 166.67 2 0.5 0 1 "" red \
+  1 "" -4 -7 0
+simundump kpool /kinetics/MAPK/MKK-P 0 0 0 0 0 0 0 0 1 0 /kinetics/geometry 5 \
+  yellow 0 -1 0
+simundump kpool /kinetics/MAPK/MAPK-P 0 0 0 0 0 0 0 0 1 0 /kinetics/geometry \
+  55 yellow 0 -7 0
+simundump kpool /kinetics/MAPK/int2 0 0 0.001 0.001 0.001 0.001 0 0 1 0 \
+  /kinetics/geometry 2 yellow 4 -2 0
+simundump kenz /kinetics/MAPK/int2/5 0 0 0 0 0 0.001 250 3 0.75 0 1 "" red 2 \
+  "" 4 -1 0
+simundump kpool /kinetics/MAPK/int4 0 0 0.001 0.001 0.001 0.001 0 0 1 0 \
+  /kinetics/geometry 17 yellow 4 -8 0
+simundump kenz /kinetics/MAPK/int4/9 0 0 0 0 0 0.001 166.67 2 0.5 0 1 "" red \
+  17 "" 4 -7 0
+simundump kpool /kinetics/MAPK/Ras-MKKKK 0 0 0.001 0.001 0.001 0.001 0 0 1 0 \
+  /kinetics/geometry 47 yellow 6 8 0
+simundump kenz /kinetics/MAPK/Ras-MKKKK/1 0 0 0 0 0 0.001 1250 10 2.5 0 1 "" \
+  red 47 "" -4 8 0
+simundump kpool /kinetics/MAPK/inactiveRas-MKKK 0 0 0 0 0 0 0 0 1 0 \
+  /kinetics/geometry 30 yellow 11 8 0
+simundump kreac /kinetics/MAPK/Neg_feedback 0 1 0.009 "" white yellow 11 2 0
+simundump kpool /kinetics/MAPK/MKK-PP 0 0 0 0 0 0 0 0 1 0 /kinetics/geometry \
+  60 yellow 8 -1 0
+simundump kenz /kinetics/MAPK/MKK-PP/7 0 0 0 0 0 0.001 8.3333 0.1 0.025 0 1 \
+  "" red 60 "" -4 -4 0
+simundump kenz /kinetics/MAPK/MKK-PP/8 0 0 0 0 0 0.001 8.3333 0.1 0.025 0 1 \
+  "" red 60 "" 4 -4 0
+simundump kpool /kinetics/MAPK/MAPK-PP 0 0 0 0 0 0 0 0 1 0 /kinetics/geometry \
+  46 yellow 8 -7 0
+simundump doqcsinfo /kinetics/doqcsinfo 0 db27.g MAPK_osc pathway \
+  "Sridhar Hariharaputran, NCBS" " Boris N. Kholodenko" "citation here" \
+  Xenopus "Oocyte extract" Cytosol Hypothetical \
+  "<a href=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=10712587>Boris N. Kholodenko Eur J Biochem. (2000) 267(6):1583-8</a> ( peer-reviewed publication )" \
+  "Mathematically equivalent" "Qualitative predictions" 10 11 0
+simundump xgraph /graphs/conc1 0 0 6000 0 0.3 0
+simundump xgraph /graphs/conc2 0 0 6000 4.5157e-05 0.3 0
+simundump xplot /graphs/conc1/MAPK-PP.Co 3 524288 \
+  "delete_plot.w <s> <d>; edit_plot.D <w>" 46 0 0 1
+simundump xplot /graphs/conc1/MAPK.Co 3 524288 \
+  "delete_plot.w <s> <d>; edit_plot.D <w>" 35 0 0 1
+simundump xplot /graphs/conc2/Ras-MKKKK.Co 3 524288 \
+  "delete_plot.w <s> <d>; edit_plot.D <w>" 47 0 0 1
+simundump xplot /graphs/conc2/MAPK.Co 3 524288 \
+  "delete_plot.w <s> <d>; edit_plot.D <w>" 35 0 0 1
+simundump xplot /graphs/conc2/MKKK.Co 3 524288 \
+  "delete_plot.w <s> <d>; edit_plot.D <w>" 16 0 0 1
+simundump xplot /graphs/conc2/MKK.Co 3 524288 \
+  "delete_plot.w <s> <d>; edit_plot.D <w>" 60 0 0 1
+simundump xgraph /moregraphs/conc3 0 0 6000 0 1 0
+simundump xgraph /moregraphs/conc4 0 0 6000 0 1 0
+simundump xcoredraw /edit/draw 0 -10 13 -10 13
+simundump xtree /edit/draw/tree 0 \
+  /kinetics/#[],/kinetics/#[]/#[],/kinetics/#[]/#[]/#[][TYPE!=proto],/kinetics/#[]/#[]/#[][TYPE!=linkinfo]/##[] \
+  "edit_elm.D <v>; drag_from_edit.w <d> <S> <x> <y> <z>" auto 0.6
+simundump xtext /file/notes 0 1
+xtextload /file/notes \
+"22 Jan 2002" \
+" " \
+" This model is based on Kholodenko, B.N." \
+"      Eur. J. Biochem. 267, 1583-1588(2000)" \
+""
+addmsg /kinetics/MAPK/MKK-PP/7 /kinetics/MAPK/MAPK REAC sA B 
+addmsg /kinetics/MAPK/int5/10 /kinetics/MAPK/MAPK MM_PRD pA 
+addmsg /kinetics/MAPK/Ras-MKKKK/1 /kinetics/MAPK/MKKK REAC sA B 
+addmsg /kinetics/MAPK/int1/2 /kinetics/MAPK/MKKK MM_PRD pA 
+addmsg /kinetics/MAPK/MKKK-P/3 /kinetics/MAPK/MKK REAC sA B 
+addmsg /kinetics/MAPK/int3/6 /kinetics/MAPK/MKK MM_PRD pA 
+addmsg /kinetics/MAPK/int1 /kinetics/MAPK/int1/2 ENZYME n 
+addmsg /kinetics/MAPK/MKKK-P /kinetics/MAPK/int1/2 SUBSTRATE n 
+addmsg /kinetics/MAPK/Ras-MKKKK/1 /kinetics/MAPK/MKKK-P MM_PRD pA 
+addmsg /kinetics/MAPK/int1/2 /kinetics/MAPK/MKKK-P REAC sA B 
+addmsg /kinetics/MAPK/MKKK-P /kinetics/MAPK/MKKK-P/3 ENZYME n 
+addmsg /kinetics/MAPK/MKK /kinetics/MAPK/MKKK-P/3 SUBSTRATE n 
+addmsg /kinetics/MAPK/MKKK-P /kinetics/MAPK/MKKK-P/4 ENZYME n 
+addmsg /kinetics/MAPK/MKK-P /kinetics/MAPK/MKKK-P/4 SUBSTRATE n 
+addmsg /kinetics/MAPK/int3 /kinetics/MAPK/int3/6 ENZYME n 
+addmsg /kinetics/MAPK/MKK-P /kinetics/MAPK/int3/6 SUBSTRATE n 
+addmsg /kinetics/MAPK/int5 /kinetics/MAPK/int5/10 ENZYME n 
+addmsg /kinetics/MAPK/MAPK-P /kinetics/MAPK/int5/10 SUBSTRATE n 
+addmsg /kinetics/MAPK/MKKK-P/4 /kinetics/MAPK/MKK-P REAC sA B 
+addmsg /kinetics/MAPK/MKKK-P/3 /kinetics/MAPK/MKK-P MM_PRD pA 
+addmsg /kinetics/MAPK/int3/6 /kinetics/MAPK/MKK-P REAC sA B 
+addmsg /kinetics/MAPK/int2/5 /kinetics/MAPK/MKK-P MM_PRD pA 
+addmsg /kinetics/MAPK/MKK-PP/8 /kinetics/MAPK/MAPK-P REAC sA B 
+addmsg /kinetics/MAPK/MKK-PP/7 /kinetics/MAPK/MAPK-P MM_PRD pA 
+addmsg /kinetics/MAPK/int5/10 /kinetics/MAPK/MAPK-P REAC sA B 
+addmsg /kinetics/MAPK/int4/9 /kinetics/MAPK/MAPK-P MM_PRD pA 
+addmsg /kinetics/MAPK/int2 /kinetics/MAPK/int2/5 ENZYME n 
+addmsg /kinetics/MAPK/MKK-PP /kinetics/MAPK/int2/5 SUBSTRATE n 
+addmsg /kinetics/MAPK/int4 /kinetics/MAPK/int4/9 ENZYME n 
+addmsg /kinetics/MAPK/MAPK-PP /kinetics/MAPK/int4/9 SUBSTRATE n 
+addmsg /kinetics/MAPK/Neg_feedback /kinetics/MAPK/Ras-MKKKK REAC A B 
+addmsg /kinetics/MAPK/Ras-MKKKK /kinetics/MAPK/Ras-MKKKK/1 ENZYME n 
+addmsg /kinetics/MAPK/MKKK /kinetics/MAPK/Ras-MKKKK/1 SUBSTRATE n 
+addmsg /kinetics/MAPK/Neg_feedback /kinetics/MAPK/inactiveRas-MKKK REAC B A 
+addmsg /kinetics/MAPK/MAPK-PP /kinetics/MAPK/Neg_feedback SUBSTRATE n 
+addmsg /kinetics/MAPK/Ras-MKKKK /kinetics/MAPK/Neg_feedback SUBSTRATE n 
+addmsg /kinetics/MAPK/inactiveRas-MKKK /kinetics/MAPK/Neg_feedback PRODUCT n 
+addmsg /kinetics/MAPK/MKKK-P/4 /kinetics/MAPK/MKK-PP MM_PRD pA 
+addmsg /kinetics/MAPK/int2/5 /kinetics/MAPK/MKK-PP REAC sA B 
+addmsg /kinetics/MAPK/MKK-PP /kinetics/MAPK/MKK-PP/7 ENZYME n 
+addmsg /kinetics/MAPK/MAPK /kinetics/MAPK/MKK-PP/7 SUBSTRATE n 
+addmsg /kinetics/MAPK/MKK-PP /kinetics/MAPK/MKK-PP/8 ENZYME n 
+addmsg /kinetics/MAPK/MAPK-P /kinetics/MAPK/MKK-PP/8 SUBSTRATE n 
+addmsg /kinetics/MAPK/MKK-PP/8 /kinetics/MAPK/MAPK-PP MM_PRD pA 
+addmsg /kinetics/MAPK/int4/9 /kinetics/MAPK/MAPK-PP REAC sA B 
+addmsg /kinetics/MAPK/Neg_feedback /kinetics/MAPK/MAPK-PP REAC A B 
+addmsg /kinetics/MAPK/MAPK-PP /graphs/conc1/MAPK-PP.Co PLOT Co *MAPK-PP.Co *46 
+addmsg /kinetics/MAPK/MAPK /graphs/conc1/MAPK.Co PLOT Co *MAPK.Co *35 
+addmsg /kinetics/MAPK/Ras-MKKKK /graphs/conc2/Ras-MKKKK.Co PLOT Co *Ras-MKKKK.Co *47 
+addmsg /kinetics/MAPK/MAPK /graphs/conc2/MAPK.Co PLOT Co *MAPK.Co *35 
+addmsg /kinetics/MAPK/MKKK /graphs/conc2/MKKK.Co PLOT Co *MKKK.Co *16 
+addmsg /kinetics/MAPK/MKK /graphs/conc2/MKK.Co PLOT Co *MKK.Co *60 
+enddump
+// End of dump
+
+call /kinetics/MAPK/notes LOAD \
+"This is the oscillatory MAPK model from Kholodenko 2000" \
+"Eur J. Biochem 267:1583-1588" \
+"The original model is formulated in terms of idealized" \
+"Michaelis-Menten enzymes and the enzyme-substrate complex" \
+"concentrations are therefore assumed negligible. The" \
+"current implementation of the model uses explicit enzyme" \
+"reactions involving substrates and is therefore an" \
+"approximation to the Kholodenko model. The approximation is" \
+"greatly improved if the enzyme is flagged as Available" \
+"which is an option in Kinetikit. This flag means that the" \
+"enzyme protein concentration is not reduced even when it" \
+"is involved in a complex. However, the substrate protein" \
+"continues to participate in enzyme-substrate complexes" \
+"and its concentration is therefore affected. Overall," \
+"this model works almost the same as the Kholodenko model" \
+"but the peak MAPK-PP amplitudes are a little reduced and" \
+"the period of oscillations is about 10% longer." \
+"If the enzymes are  not flagged as Available then the" \
+"oscillations commence only when the Km for enzyme 1" \
+"is set to 0.1 uM."
+call /kinetics/MAPK/MAPK/notes LOAD \
+"The total concn. of MAPK is 300nM " \
+"from" \
+"Kholodenko, 2000."
+call /kinetics/MAPK/MKKK/notes LOAD \
+"The total concn. of MKKK is 100nM " \
+"from" \
+"Kholodenko, 2000"
+call /kinetics/MAPK/MKK/notes LOAD \
+"The total concn. of MKK is 300nM " \
+"from" \
+"Kholodenko, 2000"
+call /kinetics/MAPK/int1/notes LOAD \
+"This is the intermediate enzyme which catalyses the " \
+"dephosphorylation of MKKK-P to MKKK. The concentration" \
+"is set to 1 nM" \
+"from" \
+"Kholodenko, 2000"
+call /kinetics/MAPK/int1/2/notes LOAD \
+"Km is 8nM and Vmax is 0.25nM.s-1 " \
+"from" \
+"Kholodenko, 2000."
+call /kinetics/MAPK/MKKK-P/notes LOAD \
+"This is the phosphorylated form of MKKK which converts MKK" \
+"to MKK-P and then to MKK-PP" \
+"from" \
+"Kholodenko, 2000."
+call /kinetics/MAPK/MKKK-P/3/notes LOAD \
+"Km is 15 nM and Vmax is 0.025s-1" \
+"from" \
+"Kholodenko, 2000"
+call /kinetics/MAPK/MKKK-P/4/notes LOAD \
+"Km is 15nM and Vmax is 0.025s-1" \
+"from " \
+"Kholodenko, 2000."
+call /kinetics/MAPK/int3/notes LOAD \
+"This intermediate enzyme catalyses the dephosphorylation of" \
+"MKK-P to MKK. The concentration is 1nM" \
+"from" \
+"Kholodenko, 2000"
+call /kinetics/MAPK/int3/6/notes LOAD \
+"The Km is 15nM and the Vmax is 0.75nM.s-1" \
+"from" \
+"Kholodenko 2000."
+call /kinetics/MAPK/int5/notes LOAD \
+"This catalyses the conversion of MAPK-P to MAPK. The " \
+"concentration is 1nM." \
+"from" \
+"Kholodenko, 2000"
+call /kinetics/MAPK/int5/10/notes LOAD \
+"The Km is 15nM and Vmax is 0.5nM.s-1" \
+"from" \
+"Kholodenko, 2000"
+call /kinetics/MAPK/MKK-P/notes LOAD \
+"This is the single phosphorylated form of MKK." \
+"from" \
+"Kholodenko, 2000."
+call /kinetics/MAPK/MAPK-P/notes LOAD \
+"This is the single phosphorylated form of MAPK" \
+"from" \
+"Kholodenko, 2000."
+call /kinetics/MAPK/int2/notes LOAD \
+"This intermediate enzyme catalyses the dephosphorylation of" \
+"MKK-PP to MKK-P. The concentration is 1nM." \
+"from" \
+"Kholodenko, 2000"
+call /kinetics/MAPK/int2/5/notes LOAD \
+"The Km is 15nM and Vmax is 0.75nM.s-1 " \
+"from" \
+"Kholodenko, 2000" \
+""
+call /kinetics/MAPK/int4/notes LOAD \
+"This intermediate enzyme catalyses the dephosphorylation of" \
+"MAPK-PP to MAPK-P. The concentration is 1nM." \
+"from" \
+"Kholodenko, 2000"
+call /kinetics/MAPK/int4/9/notes LOAD \
+"The Km is 15nM and Vmax is 0.5nM.s-1 " \
+"from" \
+"Kholodenko, 2000"
+call /kinetics/MAPK/Ras-MKKKK/notes LOAD \
+"The concn. of Ras-MKKKK* is set to 1 nM implicitly" \
+"from" \
+"Kholodenko, 2000"
+call /kinetics/MAPK/Ras-MKKKK/1/notes LOAD \
+"The Km is 10nM and Vmax is 2.5nM sec^-1. We assume that" \
+"there is 1 nM of the Ras-MKKKK." \
+"From Kholodenko, 2000." \
+"" \
+"If the enzymes are not flagged as Available, then this" \
+"Km should be set to 0.1 to obtain oscillations."
+call /kinetics/MAPK/inactiveRas-MKKK/notes LOAD \
+"This is the inactive form of Ras-MKKK. Based on the" \
+"reaction scheme from Kholodenko 2000, this is equivalent" \
+"to a binding of the MAPK-PP to the Ras. The amount of" \
+"Ras in the model is small enough that negligible amounts" \
+"of MAPK are involved in this reaction. So it is a fair" \
+"approximation to the negative feedback mechanism from" \
+"Kholodenko, 2000."
+call /kinetics/MAPK/Neg_feedback/notes LOAD \
+"From Kholodenko, 2000 Eur J Biochem  267" \
+"the Kd is 9 nM. We use a rather fast Kf of 1/sec/uM" \
+"so that equilibrium is maintained." \
+""
+call /kinetics/MAPK/MKK-PP/notes LOAD \
+"This is the double phosphorylated and active form of MKK" \
+"from" \
+"Kholodenko, 2000"
+call /kinetics/MAPK/MKK-PP/7/notes LOAD \
+"The Km is 15nM which is 0.015uM Vmax is 0.025s-1" \
+"from" \
+"Kholodenko, 2000." \
+""
+call /kinetics/MAPK/MKK-PP/8/notes LOAD \
+"The Km is 15nM which is 0.015uM and Vmax is 0.025s-1" \
+"from" \
+"Kholodenko, 2000" \
+""
+call /kinetics/MAPK/MAPK-PP/notes LOAD \
+"This is the double phosphorylated and active form of MAPK." \
+"from" \
+"Kholodenko, 2000."
+call /kinetics/doqcsinfo/notes LOAD \
+"This MAPK model is based on <a href=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=10712587>Boris N. Kholodenko Eur J Biochem. (2000) 267(6):1583-8</a> for data from Xenopus oocytes extracts."
+complete_loading
diff --git a/tests/python/test_sbml.py b/tests/python/test_sbml.py
index c297f091..800d1e12 100644
--- a/tests/python/test_sbml.py
+++ b/tests/python/test_sbml.py
@@ -2,7 +2,7 @@
 Test SBML capabilities of PyMOOSE
 """
     
-__author__           = "Dilawar Singh"
+__author__           = "Dilawar Singh, HarshaRani"
 __copyright__        = "Copyright 2015, Dilawar Singh and NCBS Bangalore"
 __credits__          = ["NCBS Bangalore"]
 __license__          = "GNU GPL"
@@ -16,20 +16,20 @@ import os
 
 import moose
 import moose.utils as mu
-
+from moose.SBML import *
 # the model lives in the same directory as the test script
 modeldir = os.path.dirname(__file__)
 
 def main():
-    modelname = os.path.join(modeldir, 'chem_models/mkp1_feedback_effects_acc4.xml')
-    model = moose.readSBML(modelname, '/model')
-    tables = moose.wildcardFind('/##[TYPE=Table2]')
+    modelname = os.path.join(modeldir, 'chem_models/00001-sbml-l3v1.xml')
+    model = mooseReadSBML(modelname, '/sbml')
+    tables = moose.wildcardFind('/sbml/##[TYPE=Table2]')
     records = {}
     for t in tables: records[t.path.split('/')[-1]] = t
     c = moose.Clock('/clock')
     moose.reinit()
     moose.start(200)
-    check(tables)
+    #check(tables)
 
 def check(tables):
     assert len(tables) > 0, "No moose.Table2 created."
diff --git a/tests/python/test_sbml_support.py b/tests/python/test_sbml_support.py
index 2ecd4428..7d30a9d3 100644
--- a/tests/python/test_sbml_support.py
+++ b/tests/python/test_sbml_support.py
@@ -40,13 +40,9 @@
 # 
 
 import moose
-import matplotlib
-import numpy as np
-import matplotlib.pyplot as plt
 import sys
-import pylab
 import os
-
+from moose.SBML import *
 script_dir = os.path.dirname( os.path.realpath( __file__) )
 
 print( "Using moose from: %s" % moose.__file__ )
@@ -59,31 +55,32 @@ def main():
 	As a general rule we created model under '/path/model' and plots under '/path/graphs'.\n
     """
 
-    mfile =  os.path.join( script_dir, 'chem_models/00001-sbml-l3v1.xml')
+    mfile =  os.path.join( script_dir, 'chem_models/acc27.g')
     runtime = 20.0
-        
-    # Loading the sbml file into MOOSE, models are loaded in path/model
-    sbmlId = moose.readSBML(mfile,'sbml')
+    writefile =  os.path.join( script_dir, 'chem_models/acc27.xml')    
+    
+    # Load the model into MOOSE, then write it back out as SBML
+    moose.loadModel(mfile,'/acc27')
+    writeerror,message,sbmlId = moose.SBML.mooseWriteSBML('/acc27',writefile)
+    if writeerror == -2:
+        print( "Could not save the model" )
+    elif writeerror == -1:
+        print( "\nThis model is not a valid SBML model; it failed the consistency check" )
+    elif writeerror == 0:
+        print( "Could not save the model" )
+    elif writeerror == 1:
+        print( "Model loaded into MOOSE with 'loadModel' and converted to SBML with 'moose.SBML.mooseWriteSBML'.\nRan for 20 sec" )
+        # Reset and Run
+        moose.reinit()
+        moose.start(runtime)
     
-
-    s1 = moose.element('/sbml/model/compartment/S1')
-    s2= moose.element('/sbml/model/compartment/S2')
-                      
-    # Creating MOOSE Table, Table2 is for the chemical model
-    graphs = moose.Neutral( '/sbml/graphs' )
-    outputs1 = moose.Table2 ( '/sbml/graphs/concS1')
-    outputs2 = moose.Table2 ( '/sbml/graphs/concS2')
-
-    # connect up the tables
-    moose.connect( outputs1,'requestOut', s1, 'getConc' );
-    moose.connect( outputs2,'requestOut', s2, 'getConc' );
-
-        
-    # Reset and Run
-    moose.reinit()
-    moose.start(runtime)
 
 def displayPlots():
+    import matplotlib
+    import numpy as np
+    import matplotlib.pyplot as plt
+    import pylab
+
     # Display all plots.
     for x in moose.wildcardFind( '/sbml/graphs/#[TYPE=Table2]' ):
         t = np.arange( 0, x.vector.size, 1 ) #sec
-- 
GitLab
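The `writeerror` branches added to `test_sbml_support.py` can be collapsed into a table lookup. A minimal sketch, assuming the codes -2, -1, 0 and 1 carry the meanings printed in the patch; `report_write_status` is a hypothetical helper, not part of `moose.SBML`:

```python
# Hypothetical helper mapping moose.SBML.mooseWriteSBML return codes
# (-2, -1, 0, 1, as handled in test_sbml_support.py) to messages.
_STATUS = {
    -2: "Could not save the model",
    -1: "Not a valid SBML model: failed the consistency check",
    0: "Could not save the model",
    1: "Model written to SBML",
}

def report_write_status(writeerror):
    """Return the user-facing message for a mooseWriteSBML return code."""
    return _STATUS.get(writeerror, "Unknown status: %d" % writeerror)

print(report_write_status(1))   # Model written to SBML
print(report_write_status(-1))  # Not a valid SBML model: failed the consistency check
```

Only the success branch (code 1) should go on to call `moose.reinit()` and `moose.start(runtime)`, as the patch does.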