Contributing to NTC templates

add or edit ntc-templates

26 October 2025   12 min read

This guide covers the steps I follow when adding or updating NTC templates. Contributing to a project on GitHub is still a learning curve for me; the days of learning CLI by repetition seem long gone, so when using or contributing to any of these NetOps-type tools I keep guides like this, as it is a bit of a struggle to remember so many new and alien things given how sporadically I use them.



The Basics


If you haven’t used NTC templates before, in short it is an easy way to get structured data from devices rather than just screen scraping the CLI output. Although there is already a multitude of templates out there, at some point you are going to come across a feature that doesn’t have a template, or hit an issue with an existing template due to deviations in the device output. In either of these situations the quickest way to fix it is to contribute to the GitHub project in the form of a Pull Request (PR).
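As a rough illustration of "structured data instead of screen scraping" (the command output and field names below are made up for the example, not taken from a real template), this is essentially what a TextFSM template does for you, here hand-rolled with a single stdlib regex:

```python
import re

# Hypothetical raw output from a 'show arp'-style command (made-up sample)
raw = """Address         Age  MAC Address     Interface
10.0.0.1        12   aabb.cc00.0100  Eth1/1
10.0.0.2        34   aabb.cc00.0200  Eth1/2"""

# Screen scraping by hand: one regex per line, assembled into dicts.
# NTC templates wrap this kind of pattern matching up in reusable TextFSM files.
line_re = re.compile(
    r"^(?P<address>\S+)\s+(?P<age>\d+)\s+(?P<mac>\S+)\s+(?P<interface>\S+)$"
)
entries = [m.groupdict() for m in map(line_re.match, raw.splitlines()[1:]) if m]
print(entries[0]["address"])  # 10.0.0.1
```

The point of the templates is that this regex logic lives in one shared, tested place rather than being rewritten in every script.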

The master branch is the primary branch to develop off, with PRs for new features and bug fixes sourced from it. You should only work on a single template per PR; don’t group multiple templates into one PR.

The elements involved in an individual NTC-template are as follows:

  • A textFSM template (./templates) in the format vendor_os_cmd.textfsm
  • An entry in the index file; this is what binds the template to the command being run
  • Test files (./tests/vendor_os/cmd_name/):
    • Raw output of the CLI command to be parsed (vendor_os_cmd.raw)
    • YAML file containing the expected parsed structured data (vendor_os_cmd.yml)
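To make that concrete, here is a minimal hypothetical example of how the three pieces relate (the command, value names and output are made up, and the template is much simpler than real ones in the repo):

```textfsm
Value ADDRESS (\S+)
Value MAC ([\w.]+)

Start
  # Skip the table header
  ^Address\s+MAC
  ^${ADDRESS}\s+${MAC}\s*$$ -> Record
  ^\s*$$
  ^. -> Error
```

The matching .raw file would hold the table exactly as the device prints it, and the .yml file would hold the rows the template is expected to extract (the test files keep these under a parsed_sample key, with the value names lowercased):

```yaml
---
parsed_sample:
  - address: "10.0.0.1"
    mac: "aabb.cc00.0100"
```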

Development environment

The NTC-templates development environment is used to create the structured data YAML file from the raw command output, as well as to run the test suite of formatting/linting (black, flake8, yamllint, etc.) and unit testing (pytest). The first thing to do is fork the NTC-templates repo by clicking Fork at the top right, clone your fork locally and add the original Git repo as an upstream repository (the name can be anything, I use upstream).

git clone https://github.com/sjhloco/ntc-templates.git
cd ntc-templates
git remote add upstream https://github.com/networktocode/ntc-templates.git
git remote -v

You need to have Poetry, poetry-plugin-shell and Docker already installed to proceed with setting up the development environment.

Setup Python virtual environment

Run the following commands to create and enable a virtual environment managed by Poetry (like python -m venv .venv and source .venv/bin/activate) before installing the project and development dependencies (like pip install -r requirements.txt). The tool.poetry.dependencies and tool.poetry.dev-dependencies sections of the pyproject.toml file hold all the dependencies that are to be installed.

cd ntc-templates/
poetry shell
poetry install

As with Python virtual environments, each time you want to use the NTC development tools you must enable the Poetry environment (poetry shell).

Build test container

Docker containers are used to perform the project’s linting/formatting and unit testing requirements (built and run using invoke).

invoke build

The container image is a Debian-based image with Poetry and all the relevant packages installed.

$ docker image ls
REPOSITORY TAG IMAGE ID CREATED SIZE
ntc_templates 8.1.0-py3.9 d2b477b8732a 2 weeks ago 849MB

When you use invoke tests it runs the different linting/formatting and unit testing tools by passing each command in as it runs the containers.

$ docker container ls -a
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
54cba8929c2a ntc_templates:8.1.0-py3.9 sh -c pytest 3 days ago Exited (0) 3 days ago pensive_dubinsky
3974fd25d671 ntc_templates:8.1.0-py3.9 sh -c 'bandit --rec… 3 days ago Exited (0) 3 days ago frosty_bhabha
0b19e7287ac8 ntc_templates:8.1.0-py3.9 sh -c 'pydocstyle .' 3 days ago Exited (0) 3 days ago pensive_perlman
f8b6c302e9e1 ntc_templates:8.1.0-py3.9 sh -c 'yamllint .' 3 days ago Exited (0) 3 days ago upbeat_pare
8d582503324a ntc_templates:8.1.0-py3.9 sh -c 'find . -name… 3 days ago Exited (0) 3 days ago competent_aryabhata
03ad6b23d6bc ntc_templates:8.1.0-py3.9 sh -c 'flake8 . --c… 3 days ago Exited (0) 3 days ago eloquent_hamilton
5f1cb15ddf36 ntc_templates:8.1.0-py3.9 sh -c 'black --chec… 3 days ago Exited (0) 3 days ago nostalgic_mendeleev

TextFSM

TextFSM is a Python module created by Google to do template-based parsing of the data that comes from network devices. It is designed for non-static data that can change (like CLI command outputs) and is heavily reliant on RegEx to parse this data.

  • A TextFSM template is a set of instructions that tells the parsing tool what patterns to look for in the text and how to organize the data it finds
  • TextFSM uses RegEx patterns to compare incoming strings against rules and executes actions on the matched lines

There are plenty of great blogs out there explaining TextFSM; below are just a few key points to help with understanding the logic:

  • Values (capture group): Variables to hold the extracted data; they become the dictionary key names in the produced data structure. It is a good idea to look at existing templates for the same feature to try and keep some type of naming conformity. The names are followed by a '(RegEx)' pattern, which is how TextFSM knows where the data value starts and ends. A few useful examples:
    • (\S+): Contiguous non-whitespace, stops at first space
    • (\d+): Matches any numeric digits 1 or more times (\D for characters that are not digits)
    • (\d*|): Allows either digits or an empty string
    • ([\w./]+): Matches any word character, includes letters, digits, underscores, dots, or forward slashes
    • Filldown: Place before '(RegEx)' to make the matched value appear in all lines (for example put the VRF name in all routing table entries)
  • Start (state): The initial state. It is either used to immediately start capturing data (ends with -> Record) or, if lines are to be ignored (such as table headers), to skip those lines and pass the matching off to an additional state (-> StateName), a new section titled with that StateName.
    • Rule: Within a state you define matching rules that tell TextFSM what to do when it encounters certain input lines, what action to perform.
      • Matches lines based on regex (start with ^\s* to allow for leading whitespace), inserting capture groups ${NAMES} to capture the data you want
      • Due to the default action (Next.NoRecord) any lines without an action will be skipped (go to next input line)
      • Should always have ^\s*$$ to match any blank lines, whitespace or nothing (first $ is regex end of the line, second $ is TextFSM syntax to mark to end the regex token)
    • Action: Based on matching rules (all elements in line of command output matching rule RegEx patterns) an action is performed before moving on to the next input line to start comparing again. Some of the most common actions are:
      • -> Continue: Another way to ignore headers, blank lines, or separators, as it continues to process rules with the current input line as if no match happened; values matched here are still assigned
      • -> Record: Writes a record of the values collected so far and resets all variables except those with a Filldown option. This could be set at the end of a line, or normally on the last line of the block of data you care about
      • -> StateName: State transition moves rule matching to another state (custom name) defined in the template. First all actions (line and record) are executed, then the next line is read and finally the current state changes to the new state and rule matching (with the newly read input line) continues in the new state
      • -> Error: Stops all line processing, discards everything collected so far and throws an Exception. A template should always end with this as it helps to ensure that every line of the raw command output is accounted for
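The value patterns above can be sanity-checked in plain Python with the stdlib re module before putting them into a template (the sample strings are illustrative):

```python
import re

# (\S+) is contiguous non-whitespace, stops at the first space
assert re.match(r"(\S+)", "Eth1/1 up").group(1) == "Eth1/1"

# (\d+) captures one or more digits (\D is the non-digit complement)
assert re.match(r"(\d+)", "42 routes").group(1) == "42"

# (\d*) also accepts an empty string, useful for optional columns
assert re.match(r"(\d*)", "no-digits-here").group(1) == ""

# ([\w./]+) allows word characters plus dots and forward slashes
assert re.match(r"([\w./]+)", "10.1.1.0/24 via").group(1) == "10.1.1.0/24"

# TextFSM's blank-line rule ^\s*$$ corresponds to this Python regex
assert re.match(r"^\s*$", "   ") is not None
print("all patterns behave as described")
```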

Online tools

There are a couple of online tools that can assist in creating templates; much like online Jinja tools, you just input the data and your TextFSM template and it will automatically generate the structured data output.

  • work template builder: Works pretty well, but you need to re-enter the data if making big changes, otherwise the output is limited to only the first object
  • slurpit template builder: Comes with AI built in, although it was pretty useless for anything but a very simple template. The really good thing about this one is that you can run it locally offline (Docker or Python script) and plug it in to ChatGPT

Speaking of AI, I thought that I could just use ChatGPT to generate my templates, but it wasn’t very good with anything complex. I found the best way was to use ChatGPT to get the initial format, then tune it into the layout I wanted by going through each bit in small steps, as the majority of the time what it supplied didn’t work.

Contributing to NTC templates

Whether you are adding a new template or fixing an issue on an existing template, both workflows follow a similar pattern ending in a PR. If you’re unsure on the naming or structure you can always look through closed PRs to see how others have done it. Before you start, do a pull from the upstream repository into your local repo to make sure you are creating branches from the latest copy of the master branch of the remote repo.

git pull upstream master

Adding a new NTC template

This working example is to add a new template for the Palo Alto ‘show routing resource’ command. If unsure of naming have a look at how the current templates are named:

$ ls ntc_templates/templates/ | grep -m1 palo
paloalto_panos_debug_swm_status.textfsm
$ ls tests/paloalto_panos | grep -m1 show
show_arp_all
$ ls tests/paloalto_panos/show_arp_all
paloalto_panos_show_arp_all.raw paloalto_panos_show_arp_all.yml

1. Structure: Create a feature branch for the template (only do one template per branch/PR) and add to it the .textfsm template file, test folder and .raw command output file (the .yml data structure file is automatically created later).

git checkout -b add_paloalto_panos_show_routing_resource

touch ntc_templates/templates/paloalto_panos_show_routing_resource.textfsm
mkdir tests/paloalto_panos/show_routing_resource
touch tests/paloalto_panos/show_routing_resource/paloalto_panos_show_routing_resource.raw

2. Index: Add the CLI command to the index file (ntc_templates/templates/index); this is what NTC-templates uses under the hood to bind the TextFSM templates to the commands being run. The columns are Template, Hostname, Platform and Command; the regex in this last column is used to catch any abbreviations of the command. The entries must be ordered as follows:

  • OS in alphabetical order, keep a blank line between OSs
  • Template name in length order (longest to shortest); when length is the same, use alphabetical order of command name

paloalto_panos_show_routing_resource.textfsm, .*, paloalto_panos, sh[[ow]] ro[[uting]] res[[ource]]
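The double-bracket syntax in the Command column is ntc-templates shorthand for optional trailing letters; as I understand it, sh[[ow]] is expanded into nested optional regex groups equivalent to sh(o(w)?)?, which can be sanity-checked with the stdlib re module (the expansion shown here is my reading of the convention, not code copied from the library):

```python
import re

# sh[[ow]] behaves like sh(o(w)?)? -- each bracketed letter is optional,
# but only in order, so 'sh', 'sho' and 'show' match while 'shw' does not.
pattern = re.compile(r"^sh(o(w)?)?$")

for cmd in ("sh", "sho", "show"):
    assert pattern.match(cmd)
assert pattern.match("shw") is None  # letters must be consumed in order
print("all abbreviations match")
```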

3. Command Output: Add the device’s command output (don’t include the cmd itself) to the .raw file in the tests directory.

  • tests/paloalto_panos/show_routing_resource/paloalto_panos_show_routing_resource.raw

4. TextFSM template: Based on the command output, create the TextFSM template; it is easier to do this using the slurpit template builder before adding the resulting template here.

  • ntc_templates/templates/paloalto_panos_show_routing_resource.textfsm

5. Structured data: Use the gen-yaml-file script to generate the .yml structured data file (it uses the .textfsm and .raw files from the previous steps).

  • tests/paloalto_panos/show_routing_resource/paloalto_panos_show_routing_resource.yml

python cli.py gen-yaml-file -f tests/paloalto_panos/show_routing_resource/paloalto_panos_show_routing_resource.raw

6. Pytest: Unit test the functionality of the newly created template.

pytest -vv tests/test_structured_data_against_parsed_reference_files.py::test_raw_data_against_mock[tests/paloalto_panos/show_routing_resource/paloalto_panos_show_routing_resource.raw]

Fixing an existing NTC Template

This working example is to fix an issue with the Cisco IOS ‘show redundancy’ template caused by extra lines being in the 9300 Stackwise output.

1. Structure: Create a feature branch under which the fix will be applied, again only do one issue per branch/PR.

git checkout -b fix_cisco_ios_show_redundancy

2. Command Output: Add an additional .raw file to the tests folder with the command output that fails, and try to generate a .yml file; it should fail with the same error you saw when using ntc-templates.

nano tests/cisco_ios/show_redundancy/cisco_ios_show_redundancy4.raw
>>> add the additional output to the file
python cli.py gen-yaml-file -f tests/cisco_ios/show_redundancy/cisco_ios_show_redundancy4.raw

3. TextFSM template: Update the template with the fixes and run again; it should now pass, creating a .yml file from the new command output.

python cli.py gen-yaml-file -f tests/cisco_ios/show_redundancy/cisco_ios_show_redundancy4.raw

Once you are happy with the output, run the generation tool for all files in the tests folder to add any new variables and to make sure your change doesn’t break the existing template. You shouldn’t see any changes to these files unless you have added extra variables.

python cli.py gen-yaml-folder -f tests/cisco_ios/show_redundancy

4. Pytest: Run unit tests against the new file and also any existing files if they changed as part of step 3.

pytest -vv tests/test_structured_data_against_parsed_reference_files.py::test_raw_data_against_mock[tests/cisco_ios/show_redundancy/cisco_ios_show_redundancy.raw]
pytest -vv tests/test_structured_data_against_parsed_reference_files.py::test_raw_data_against_mock[tests/cisco_ios/show_redundancy/cisco_ios_show_redundancy4.raw]

Create the Pull Request

PR creation is the same process for adding a new template or fixing an issue with an existing template.

1. Test: Use the Docker development environment to ensure your changes abide by the project’s formatting/linting rules as well as to unit test the functionality.

invoke tests

2. Git commit: Stage and commit the changes to your local branch. As at this point it is still only a local branch, if you need to make changes after your first commit you can always amend them to that commit (rather than making a new commit) to keep the history clean (git commit --amend --no-edit).

git add .
git commit -m 'Added new template for paloalto_panos show routing resource'

3. Git Fetch and Push: First make sure your local branch is up to date with the remote (upstream) master branch; if it is not, rebase remote master into your local feature branch (for local-only branches rebase keeps history cleaner than a merge).

git checkout master
git fetch upstream
git checkout add_paloalto_panos_show_routing_resource
git rebase upstream/master

You can now push the new branch up to your GitHub fork (origin).

git push origin add_paloalto_panos_show_routing_resource

4. Create PR: Back in GitHub you will now have a “Compare & pull request” button at the top of your fork; there will be one for each branch pushed. Before submitting the PR you get the chance to compare the diff of your changes and expand on the reasons for the PR in the comments.

To update a PR that has not yet been approved, all you need to do is update your local branch (commit) and push it to your fork; GitHub will automatically update the PR (you will see the new commit there). If NTC have requested changes it is likely they merged the master branch into your PR branch, so before making changes pull down the merged branch.

git pull origin add_paloalto_panos_show_routing_resource

poetry shell
python cli.py gen-yaml-file -f tests/paloalto_panos/show_routing_resource/paloalto_panos_show_routing_resource.raw
invoke tests

git add .
git commit -m 'Updated blah, blah, blah'
git push origin add_paloalto_panos_show_routing_resource

5. Cleanup: Once the PR has been successfully merged and closed by NTC (you will see this in the PR), pull down the latest version of the remote master branch (from upstream), merge it into your local master branch and push the changes back up to your GitHub fork (origin).

git checkout master
git fetch upstream
git merge upstream/master
git push origin master

The feature branch is no longer needed, so delete it locally as well as in your remote fork.

git branch -D add_paloalto_panos_show_routing_resource
git push origin -d add_paloalto_panos_show_routing_resource
