Hello and Happy New Year everyone! Network to Code is very excited to start this new year presenting at Networking Field Day 27. Here is a list of our speakers and their topics:
John Anderson – Nautobot App Platform Overview
This session briefly covers Nautobot as a Source of Truth (SoT) and then covers in depth:
Nautobot as an App Platform
Why the concept of a platform is important
Open source Nautobot apps currently available
Tim Fiola – Automating Circuit Maintenances
Managing circuit maintenances and notifications is a well-known problem for network operators. Circuit maintenances are scheduled by Network Service Providers (NSPs) and temporarily affect the actual state of a circuit. There are steps that must be taken to ensure network operators and automation tools know the state of a circuit, which is ultimately dictated by the NSP.
This session explains how you can let Nautobot automatically take care of fetching notifications, normalizing their data (most commonly, every NSP has its own notification template), and finally updating the SoT to keep track of past, present, and future changes related to circuit maintenances.
Tim Schreyack – NetDevOps-Driven Configuration Compliance
Network Automation is a journey and there are still growing trends around NetDevOps and learning skills and tools such as JSON, YAML, Jinja2 templating, Python, and Ansible.
This session walks through how you can take a gradual path to learning and how you can apply those skills to achieve NetDevOps-Driven Configuration Compliance.
Tim Schreyack – Using ChatOps to Talk to Your Network
With network automation, you must consider how users will interact with the automation infrastructure. Who opens every network request you work on? How often are you fielding requests just to get data and relay it to another team? Wouldn’t it be great to have a bot respond to requests to bounce a port, check an interface’s VLANs, check a rack elevation diagram, view inventory, view dashboards in tools like Grafana, and a whole lot more?!
This session showcases network-centric ChatOps, with demos combining Microsoft Teams, Webex Teams, Slack, and Mattermost with Arista CloudVision, Cisco ACI, Cisco Meraki, Grafana, Nautobot, IP Fabric, Kentik, Ansible, and more!
John Anderson – Synchronizing Data to Create a Single Source of Truth
It is extremely important that the Source of Truth (SoT) hold data on the intended state of the network. However, it is nearly impossible to store all the authoritative data required to manage network configurations in a single application. This is why it is critical to have a framework that synchronizes data from multiple authoritative sources for different types of data into your SoT, including:
IPAM/DDI platforms
CMDBs
Circuit databases
Any other tool that is the authoritative source for a specific dataset
In this sense, the SoT acts as an aggregation layer across all your authoritative sources. This aggregation allows users to leverage existing tools while getting the data into a Single Source of Truth, which provides a unified view into ALL data and can be used to power network automation.
Conclusion
Please join us and tune in at this link from 10:30am to 12:30pm PT (1:30pm-3:30pm ET) to view the livestream!
Does this all sound amazing? Want to know more about how Network to Code can help you do this? Reach out to our sales team. If you want to help make this a reality for our clients, check out our careers page.
A few months ago, Network to Code released two open source projects (more info) to help solve a common problem in modern networks: understanding when a circuit is going through a planned maintenance. On one side is the circuit-maintenance-parser, which parses notifications into a common data structure; on the other is the nautobot-circuit-maintenance plugin, which uses the parser library to automatically populate the data into Nautobot. Following months of development, we are happy to announce the release of version 2.0.0 of the parser, which comes with many improvements based on existing customer deployments and now covers 19 different providers!
circuit-maintenance-parser Library
In that blog post, we acknowledged that we were not the first to try to solve this issue. We decided to adopt the format proposed here (the iCalendar format) as our gold standard.
Now, you might be wondering: why do we need a parser library when there is a well-defined format? The answer is well known: being just a recommendation, the format is not fully adopted by all the Network Service Providers (NSPs). So there is still a need to parse arbitrary data formats in order to obtain something conforming to a standard Maintenance containing the following attributes:
provider: identifies the provider of the service that is the subject of the maintenance notification.
account: identifies an account associated with the service that is the subject of the maintenance notification.
maintenance_id: contains text that uniquely identifies the maintenance that is the subject of the notification.
circuits: list of circuits affected by the maintenance notification and their specific impact.
status: defines the overall status or confirmation for the maintenance.
start: timestamp that defines the start date of the maintenance in GMT.
end: timestamp that defines the end date of the maintenance in GMT.
stamp: timestamp that defines the update date of the maintenance in GMT.
organizer: defines the contact information included in the original notification.
Please refer to the BCOP for more details about these attributes.
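The attribute list above can be pictured as a simple data structure. The following is an illustrative sketch only (the real library defines a richer, validated model; the field names simply mirror the attributes described above, and the sample values are made up):

```python
# Illustrative sketch of the normalized Maintenance structure described above.
# The real circuit-maintenance-parser model is richer and validated; this
# dataclass only mirrors the attribute list for clarity.
from dataclasses import dataclass
from typing import List


@dataclass
class CircuitImpact:
    circuit_id: str
    impact: str  # e.g. "OUTAGE", "DEGRADED", "NO-IMPACT"


@dataclass
class Maintenance:
    provider: str                  # provider of the affected service
    account: str                   # account associated with the service
    maintenance_id: str            # unique identifier of the maintenance
    circuits: List[CircuitImpact]  # affected circuits and their impact
    status: str                    # overall status, e.g. "CONFIRMED"
    start: int                     # start timestamp (epoch, GMT)
    end: int                       # end timestamp (epoch, GMT)
    stamp: int                     # last-update timestamp (epoch, GMT)
    organizer: str                 # contact info from the notification


# Hypothetical example of a parsed notification.
maint = Maintenance(
    provider="ntt",
    account="ACME-1",
    maintenance_id="VNOC-1-1234",
    circuits=[CircuitImpact(circuit_id="cid-1", impact="OUTAGE")],
    status="CONFIRMED",
    start=1642500000,
    end=1642503600,
    stamp=1642400000,
    organizer="noc@example.com",
)
```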
This library aims to fill the current gap between an ideal representation of a circuit maintenance notification and the current providers’ formats, enabling automation to be built around these notifications.
The first version of circuit-maintenance-parser had a simple workflow that eventually became a blocker for more complex use cases. This motivated a new middleware that can combine multiple data parts using custom logic to process composed notifications and accommodate future use cases. More details about the logic workflow are available in the library README.
Supported Providers
One of the key success indicators for the library is how many providers are supported. Thanks to the many examples contributed by early adopters, the supported providers list has grown to 19 and is growing quickly:
AquaComms
AWS
Cogent
Colt
EuNetworks
GTT
HGC
Lumen
Megaport
Momentum
NTT
PacketFabric
Seaborn
Sparkle
Telia
Telstra
Turkcell
Verizon
Zayo
Moreover, the gold-standard format is supported out of the box via the GenericProvider, so any NSP that sends notifications in the recommended iCalendar format is supported by default.
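To make the gold-standard format concrete, here is a hypothetical BCOP-style notification and a minimal, stdlib-only extraction of its X-MAINTNOTE-* properties. The real GenericProvider uses a full iCalendar parser; this sketch only illustrates where each Maintenance attribute comes from, and the payload values are invented:

```python
# Hypothetical BCOP-style (iCalendar) maintenance notification.
ICAL_NOTIFICATION = """\
BEGIN:VCALENDAR
BEGIN:VEVENT
DTSTART:20220118T080000Z
DTEND:20220118T100000Z
X-MAINTNOTE-PROVIDER:example-nsp.com
X-MAINTNOTE-ACCOUNT:ACME-1
X-MAINTNOTE-MAINTENANCE-ID:WorkOrder-31415
X-MAINTNOTE-STATUS:CONFIRMED
X-MAINTNOTE-OBJECT-ID:circuit-1234
END:VEVENT
END:VCALENDAR
"""


def extract_maintnote_fields(ical_text: str) -> dict:
    """Collect the X-MAINTNOTE-* properties into a plain dict.

    A real implementation would use an iCalendar parser; simple line
    splitting is enough to show the mapping to Maintenance attributes.
    """
    fields = {}
    for line in ical_text.splitlines():
        if line.startswith("X-MAINTNOTE-"):
            key, _, value = line.partition(":")
            fields[key.removeprefix("X-MAINTNOTE-").lower()] = value
    return fields


fields = extract_maintnote_fields(ICAL_NOTIFICATION)
```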
How to Use It?
The circuit_maintenance_parser library requires two things:
The notification data: the data the library inspects to extract maintenance notifications. It can be simple (a single data type and content, such as an iCalendar notification) or more complex (multiple data parts of different types, such as an email).
The provider identifier: used to select the appropriate Provider. Each Provider contains the logic to process the notification data using its associated parsers.
Python Library
The first step is to define the Provider we will use to parse the notifications. By default, the GenericProvider (used when no other provider type is specified) supports parsing iCalendar notifications in the recommended format:
```python
from circuit_maintenance_parser import init_provider

generic_provider = init_provider()
type(generic_provider)
# <class 'circuit_maintenance_parser.provider.GenericProvider'>
```
However, some Providers don't fully implement the standard, or some information may be missing, for example the organizer email. Custom-defined Providers are also supported and can be used to tailor the data extraction to the notifications your organization receives.
Once we have the Provider ready, we need to initialize the data to process. We call this NotificationData; it can be initialized from simple content and a type, or from more complex structures such as an email.
Finally, we retrieve the maintenances from the data (a List, because a notification can contain multiple maintenances) by calling the get_maintenances method on the Provider instance.
Notice that with either the GenericProvider or the NTT Provider we get the same result from the same parsed data, because both use exactly the same Processor and Parser. The only difference is that the NTT Provider supplies custom default values for NTT in case the notification doesn't contain that data. Here the notification contains all the information, so the Provider's custom defaults are not used.
Even though the library aims to include support for as many providers as possible, it's likely that not all of the thousands of NSPs are supported, and you may need to add support for a new one. Adding a new Provider is quite straightforward; in the following example, we add support for an imaginary provider, ABCDE, that uses HTML notifications.
The first step is creating a new file: circuit_maintenance_parser/parsers/abcde.py. This file contains all the custom parsers needed for the provider and imports the base classes for each parser type from circuit_maintenance_parser.parser. In this example, we only need to import Html and, in the child class, implement the methods required by the class, in this case parse_html(), which returns a dict with all the data this Parser can extract. Here we have two helper methods, _parse_bs and _parse_tables, that implement the logic to navigate the notification data.
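To give a feel for the kind of logic parse_html() typically implements, here is a self-contained sketch for the imaginary ABCDE provider. The real parser would subclass circuit_maintenance_parser.parser.Html and navigate a BeautifulSoup object; everything below, including the notification layout, is invented for illustration:

```python
# Self-contained illustration of the extraction a custom HTML parser
# performs. A real parser subclasses circuit_maintenance_parser.parser.Html
# and works on a BeautifulSoup object; the ABCDE layout here is made up.
import re

ABCDE_HTML = """
<html><body>
<p>Maintenance ID: <b>ABCDE-42</b></p>
<p>Status: <b>CONFIRMED</b></p>
<p>Affected circuit: <b>cid-1234</b></p>
</body></html>
"""


def parse_html(html: str) -> dict:
    """Extract the Maintenance attributes this parser is responsible for."""

    def grab(label: str):
        # Find "<label>: <b>value</b>" in the hypothetical notification body.
        match = re.search(rf"{label}:\s*<b>([^<]+)</b>", html)
        return match.group(1) if match else None

    return {
        "maintenance_id": grab("Maintenance ID"),
        "status": grab("Status"),
        "circuits": [{"circuit_id": grab("Affected circuit"), "impact": "OUTAGE"}],
    }


result = parse_html(ABCDE_HTML)
```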
The next step is to create the new Provider by defining a new class in circuit_maintenance_parser/provider.py. This class inherits from GenericProvider and only needs to define a few attributes:
_processors: a list of Processor instances, each using several data Parsers. In this example, we don't need to create a new custom Processor because the combined logic serves well (the most likely case); we only need to use the newly defined HtmlParserABCDE1 together with the generic EmailDateParser, which extracts the email date. Notice that you could have multiple Processors with different Parsers in this list, supporting several formats.
_default_organizer: a default used to fill the organizer attribute in the Maintenance when the information is not part of the original notification.
_include_filter: a mapping from data types to lists of regular expressions; if provided, a notification must match to be parsed. This feature filters out noise from notifications received from the same provider that are not related to circuit maintenance.
_exclude_filter: the opposite mapping, defining via regex which notifications must not be parsed.
And expose the new Provider in circuit_maintenance_parser/__init__.py:
```python
from .provider import (
    GenericProvider,
    ABCDE,
    ...
)

SUPPORTED_PROVIDERS = (
    GenericProvider,
    ABCDE,
    ...
)
```
Last, but not least, you should update the tests!
Test the new Parser in tests/unit/test_parsers.py
Test the new Provider logic in tests/unit/test_e2e.py
In both cases, add the necessary data samples in tests/unit/data/abcde/.
Conclusion
Give it a try! As the community grows, more and more Providers will be added, and you can benefit from all of them. Developing a new Provider or Parser is straightforward, and you can contribute to the library via Pull Requests or Issues, providing notification samples to develop against.
As shown in How to Use It?, you can easily integrate the library with any automation application by simply passing the notification data and selecting the Provider that should parse it, and then doing whatever you want with the structured output.