With the release of Nautobot 2.1.9 and 1.6.16 came a new requirement for pynautobot to include an authentication token on some initial calls that previously did not require one. To ensure that pynautobot (and, by extension, Nautobot Ansible) and the Nautobot Helm Chart work with the most recent versions of Nautobot, new versions have been released.
pynautobot & Nautobot Ansible
First, to check which version of pynautobot you have, run pip list in that environment. Here is an example of using grep to look only for pynautobot.
❯ pip list | grep pynautobot
pynautobot 2.0.2
Nautobot 1.6 Environments
If you are continuing on the LTM release train of 1.6, your pynautobot needs to be upgraded to 1.5.2 in order to continue using the Ansible modules (4.5.0). No update to the Ansible modules is required; only the underlying pynautobot version needs updating. Complete this with:
pip install pynautobot==1.5.2
Accidental Upgrade to 2.x of pynautobot?
If you accidentally upgraded to the latest version of pynautobot but intended to be on 1.x, just issue the same command as above and you will get the right version. Nothing further needs to be done; no harm done.
pip install pynautobot==1.5.2
Nautobot 2.1 Environments
For those on the latest Nautobot application version of 2.1.9, please upgrade the pynautobot instance in your Ansible environment to the latest release, 2.1.1:
pip install --upgrade pynautobot
Nautobot Helm Chart
First, to check which version of the Nautobot Helm Chart you have configured, run helm show chart nautobot/nautobot to get the full information about the configured chart. You will see multiple versions in the output; the chart version that matters appears on the last line and is a root-level key in the YAML output.
❯ helm show chart nautobot/nautobot
annotations:
... Truncated for brevity ...
sources:
- https://github.com/nautobot/nautobot
- https://github.com/nautobot/helm-charts
version: 2.0.5
Warning – READ BEFORE PROCEEDING
The latest version of the helm chart sets the default Nautobot version to 2.1.9. If you are NOT providing a custom image or statically declaring the version, you WILL be upgraded to 2.1.9. For more information on using a custom image, please see the documentation here; if you are using the Network to Code maintained images with a specific version, please ensure nautobot.image.tag is set to the tagged version you expect to use. Below are some examples for values.yaml provided to a helm release.
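For example, a values.yaml fragment that statically pins the image tag via the chart's nautobot.image.tag setting might look like the sketch below (the tag value is illustrative; use the version you actually intend to run):

```yaml
# Pin the Nautobot image to a specific tag so a chart upgrade
# does not silently move you to a newer Nautobot release.
nautobot:
  image:
    tag: "1.6.16"  # illustrative value; set to the version you expect to run
```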
If you are on a 1.X.X version of the helm chart please review the upgrade guide here before proceeding.
Before you can use the new version of the helm chart you must update the helm repo.
❯ helm repo update nautobot
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "nautobot" chart repository
Update Complete. ⎈Happy Helming!⎈
Update Helm Release
Now you can proceed to update your helm release with the latest helm chart version.
❯ helm upgrade <name of helm release> nautobot/nautobot -f values.yml --version 2.1.0
Release "nautobot" has been upgraded. Happy Helming!
NAME: nautobot
LAST DEPLOYED: Wed Mar 27 20:09:47 2024
NAMESPACE: default
STATUS: deployed
REVISION: 3
NOTES:
*********************************************************************
*** PLEASE BE PATIENT: Nautobot may take a few minutes to install ***
*********************************************************************
... Truncated for brevity ...
Conclusion
When issues arise in playbooks that were previously working fine, it's often worth giving your dependency packages a quick update. Hope this helps. Happy automating.
Does this all sound amazing? Want to know more about how Network to Code can help you do this? Reach out to our sales team. If you want to help make this a reality for our clients, check out our careers page.
If you’ve been watching this space, you’ve seen me talking about Nautobot’s GraphQL capabilities and how GraphQL helps you:
GraphQL queries are much more efficient than RESTful queries
GraphQL makes your life easier by making data more accessible
The above results in dramatic improvement in your quality of life
This post is a case study in those aspects. It will empirically demonstrate how GraphQL:
Minimizes your number of queries
Returns only the data you want
Makes it so you don’t have to manually filter data and build the desired data structure in a script
Creates faster automation
Reduces your workload
I will be running this case study using https://demo.nautobot.com/. Readers are encouraged to follow along, using the scripts below.
The Problem Statement
In this case study, the goal is to gather specific information for certain network elements from Nautobot. Specifically, we want a data structure with the following information:
We want information from all devices within the ams site
The data structure should organize information so that all the data is grouped on a per-device basis
We want this specific data for each device:
Device name
Device role
All the interface names
The list of IP address(es) for each interface, even if the interface has no configured IP address(es)
The GraphQL Solution
The GraphQL solution will leverage the pynautobot Python package, which provides a customized and efficient way to programmatically query Nautobot via GraphQL.
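The heart of the solution is the query itself. A sketch of a query matching the problem statement (field names assume the Nautobot 1.x GraphQL schema used by demo.nautobot.com at the time) looks like:

```graphql
query {
  devices(site: "ams") {
    name
    device_role {
      name
    }
    interfaces {
      name
      ip_addresses {
        address
      }
    }
  }
}
```

A single request returns every device in the ams site with its role, all of its interfaces, and any IP addresses on each interface, already grouped per device.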
Take specific note of the following GraphQL features demonstrated by this solution:
The returned data comes back in a structure that matches that of the query
GraphQL returns only the requested data
The returned data is ready for programmatic parsing
Running the script six times produced an average run time of 2.23 seconds, returning data for ten devices in the ams site.
RESTful Solution
For the RESTful solution, we’re not concerned about matching the exact data structure returned by GraphQL. We’re only concerned with getting the same data into a structure that can be parsed for programmatic use.
The GraphQL results were grouped by device, and the RESTful solution will do that as well, but will have some small format changes.
Here is the format for the data structure that the RESTful solution will return:
{
  <device_1_name>: {
    'role': <device_1_role>,
    'interface_info': {
      <interface_1_name>: [list of ip addresses for interface_1],
      <interface_2_name>: [list of ip addresses for interface_2],
      . . .
    }
  },
  . . .
  <device_n_name>: {
    'role': <device_n_role>,
    'interface_info': {
      <interface_1_name>: [list of ip addresses for interface_1],
      <interface_2_name>: [list of ip addresses for interface_2],
      . . .
    }
  }
}
The format above is slightly different than that of the GraphQL results, but is still programmatically parsable.
The RESTful script that returns the data is below. When examining it, take note of the following:
We had to artificially construct the data structure, which required a non-trivial amount of work
The RESTful script requires three distinct API calls, with some calls iterated multiple times
Each API call returns WAY more information than we are interested in
Since the call to get interface data for the ams site returns so much extraneous information, Nautobot applies the default limit of 50 results per call
The limit constraint reduces the load on the Nautobot server
With the default database in https://demo.nautobot.com, the call to get all the interface data iterates six times, returning up to 50 results per call
The call to get the IP address information must iterate once for each of the ten devices in the ams site
The RESTful script is over twice as long as the GraphQL script and is much more complex
The amount of time required to construct, test, and validate the RESTful script was well over an order of magnitude longer than that required for the GraphQL script (your mileage may vary!)
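The pagination handling described above follows a common REST pattern: request a page, collect its results, then follow the 'next' URL until it is None. A minimal sketch of that loop, with a stand-in for requests.get (the page data below is fake and purely illustrative):

```python
# Sketch of the 'next'-URL pagination loop the RESTful script uses.
# fetch_page stands in for requests.get(url).json(); the pages are fake.
PAGES = {
    "page1": {"results": list(range(50)), "next": "page2"},
    "page2": {"results": list(range(50, 100)), "next": None},
}

def fetch_page(url):
    """Stand-in for one paginated Nautobot REST response."""
    return PAGES[url]

def get_all_results(url):
    """Follow 'next' links until exhausted, accumulating 'results'."""
    results = []
    while url is not None:
        page = fetch_page(url)
        results.extend(page["results"])
        url = page["next"]
    return results

print(len(get_all_results("page1")))  # 100 items gathered across two pages
```

This is exactly why the interface call iterates six times against the demo database: 300-odd interfaces divided into pages of 50.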
"""Use REST API calls to get the following info for each device in 'ams' site:
- device name
- device role
- interface info
  - interface name
  - ip address
"""
import json
import requests
from pprint import pprint
from time import time

start_time = time()

# Looking at the Nautobot API:
# - /api/dcim/devices gives you name and role
# - /api/dcim/interfaces gives you all the interface info
# - /api/ipam/ip-addresses gets IP address info

# Define general request components
payload = {}
headers = {
    "Content-Type": "application/json",
    "Authorization": "Token aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",
}

##########################################
# Define devices url, query for 'ams' site devices
ams_dev_url = "https://demo.nautobot.com/api/dcim/devices?site=ams"

# Query Nautobot for the ams devices
ams_dev_resp = requests.get(ams_dev_url, headers=headers, data=payload)

# Turn the response text string into json
ams_devices_json = ams_dev_resp.json()

# Device info dict
device_info = {}

# Create a dict with device names as keys; the value for each key will be a dict.
for device in ams_devices_json["results"]:
    role = device["device_role"]["display"]
    dev_name = device["name"]
    device_info[dev_name] = {
        "role": role,
        "interface_info": {},
    }

print("device_info is:")
pprint(device_info)
print()
print()

##########################################
print("The GraphQL query returned all interfaces for a device, regardless of whether ")
print("an ip address was configured; we will match that here.")
print()
print("Gathering interface info for `ams` site.")

# Define url for device interfaces in 'ams' site
ams_interface_url = "https://demo.nautobot.com/api/dcim/interfaces?site=ams"

# Define a list to hold the interface info for `ams` site
ams_interface_info = []

# Account for ams_interface_url results limit; iterate url until 'next' url is None
while ams_interface_url is not None:
    ams_interface_resp = requests.get(ams_interface_url, headers=headers, data=payload)
    ams_interface_json = ams_interface_resp.json()
    ams_interface_url = ams_interface_json["next"]
    print("ams_interface_url is {}".format(ams_interface_url))
    ams_interface_info.extend(ams_interface_json["results"])

print()
print("Adding interface names to device_info for the appropriate device.")

# Filter out the interface names and add them in device_info
for interface_entry in ams_interface_info:
    dev_name = interface_entry["device"]["name"]
    interface_name = interface_entry["name"]
    device_info[dev_name]["interface_info"][interface_name] = []

print()

#####################################
print("Finally, gather the IP address info for each interface.")
print("This RESTful call returns only interfaces that have IP addresses configured.")
print()

ip_info_list = []
for device in device_info.keys():
    ip_url = "https://demo.nautobot.com/api/ipam/ip-addresses?device={}".format(device)
    # Account for ip_url results limit; iterate url until 'next' url is None
    while ip_url is not None:
        print("ip_url = {}".format(ip_url))
        ip_url_response = requests.get(ip_url, headers=headers, data=payload)
        ip_json = ip_url_response.json()
        ip_url = ip_json["next"]
        ip_info_list.extend(ip_json["results"])

print()
print("Add the IP address info to device_info.")
print()
for item in ip_info_list:
    device = item["assigned_object"]["device"]["name"]
    interface = item["assigned_object"]["name"]
    address = item["address"]
    device_info[device]["interface_info"][interface].append(address)

print("Here is the completed data structure:")
pprint(device_info)
print()

end_time = time()
run_time = end_time - start_time
print("Run time = {}".format(run_time))
Here are the results of the RESTful script:
blogs/graphql_vs_restful % python3 -i restful_api_query_ams_device_ints.py
device_info is:
{'ams-edge-01': {'interface_info': {}, 'role': 'edge'},
 'ams-edge-02': {'interface_info': {}, 'role': 'edge'},
 'ams-leaf-01': {'interface_info': {}, 'role': 'leaf'},
 'ams-leaf-02': {'interface_info': {}, 'role': 'leaf'},
 'ams-leaf-03': {'interface_info': {}, 'role': 'leaf'},
 'ams-leaf-04': {'interface_info': {}, 'role': 'leaf'},
 'ams-leaf-05': {'interface_info': {}, 'role': 'leaf'},
 'ams-leaf-06': {'interface_info': {}, 'role': 'leaf'},
 'ams-leaf-07': {'interface_info': {}, 'role': 'leaf'},
 'ams-leaf-08': {'interface_info': {}, 'role': 'leaf'}}


The GraphQL query returned all interfaces for a device, regardless of whether
an ip address was configured; we will match that here.

Gathering interface info for `ams` site.
ams_interface_url is https://demo.nautobot.com/api/dcim/interfaces/?limit=50&offset=50&site=ams
ams_interface_url is https://demo.nautobot.com/api/dcim/interfaces/?limit=50&offset=100&site=ams
ams_interface_url is https://demo.nautobot.com/api/dcim/interfaces/?limit=50&offset=150&site=ams
ams_interface_url is https://demo.nautobot.com/api/dcim/interfaces/?limit=50&offset=200&site=ams
ams_interface_url is https://demo.nautobot.com/api/dcim/interfaces/?limit=50&offset=250&site=ams
ams_interface_url is https://demo.nautobot.com/api/dcim/interfaces/?limit=50&offset=300&site=ams
ams_interface_url is None

Adding interface names to device_info for the appropriate device.

Finally, gather the IP address info for each interface.
This RESTful call returns only interfaces that have IP addresses configured.

ip_url = https://demo.nautobot.com/api/ipam/ip-addresses?device=ams-edge-01
ip_url = https://demo.nautobot.com/api/ipam/ip-addresses?device=ams-edge-02
ip_url = https://demo.nautobot.com/api/ipam/ip-addresses?device=ams-leaf-01
ip_url = https://demo.nautobot.com/api/ipam/ip-addresses?device=ams-leaf-02
ip_url = https://demo.nautobot.com/api/ipam/ip-addresses?device=ams-leaf-03
ip_url = https://demo.nautobot.com/api/ipam/ip-addresses?device=ams-leaf-04
ip_url = https://demo.nautobot.com/api/ipam/ip-addresses?device=ams-leaf-05
ip_url = https://demo.nautobot.com/api/ipam/ip-addresses?device=ams-leaf-06
ip_url = https://demo.nautobot.com/api/ipam/ip-addresses?device=ams-leaf-07
ip_url = https://demo.nautobot.com/api/ipam/ip-addresses?device=ams-leaf-08

Add the IP address info to device_info.

Here is the completed data structure:
{'ams-edge-01': {'interface_info': {'Ethernet1/1': ['10.11.192.0/32'],
                                    'Ethernet10/1': ['10.11.192.32/32'],
                                    'Ethernet11/1': [],
                                    'Ethernet12/1': [],
                                    < --- snip for brevity --- >
                                    'Ethernet9/1': ['10.11.192.28/32'],
                                    'Loopback0': ['10.11.128.1/32'],
                                    'Management1': []},
                 'role': 'edge'},
 'ams-edge-02': {'interface_info': {'Ethernet1/1': ['10.11.192.1/32'],
                                    'Ethernet10/1': ['10.11.192.34/32'],
                                    < --- snip for brevity --- >
                                    'Loopback0': ['10.11.128.2/32'],
                                    'Management1': []},
                 'role': 'edge'},
 'ams-leaf-01': {'interface_info': {'Ethernet1': ['10.11.192.5/32'],
                                    < --- snip for brevity --- >
                                    'vlan99': ['10.11.64.0/32']},
                 'role': 'leaf'},
 < --- some devices snipped for brevity --- >
 'ams-leaf-07': {'interface_info': {'Ethernet1': ['10.11.192.29/32'],
                                    'Ethernet10': [],
                                    < --- snip for brevity --- >
                                    'vlan99': ['10.11.70.0/32']},
                 'role': 'leaf'},
 'ams-leaf-08': {'interface_info': {'Ethernet1': ['10.11.192.33/32'],
                                    < --- snip for brevity --- >
                                    'vlan99': ['10.11.71.0/32']},
                 'role': 'leaf'}}

Run time = 13.60936713218689
>>>
Running the script six times produced an average run time of 14.9 seconds.
This script created a data structure that is not identical to the structure created by GraphQL, but is similar in nature and is still parsable.
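To illustrate how close the two shapes are, a few lines of post-processing can convert the REST-style device_info dict into the list-of-device-dicts shape the GraphQL query returns. This is a sketch; the sample data below is a small, illustrative slice of the structures shown above:

```python
# Convert {name: {'role': ..., 'interface_info': {ifname: [addrs]}}}
# into the list-of-devices shape GraphQL returns.
def to_graphql_shape(device_info):
    return [
        {
            "name": name,
            "device_role": {"name": info["role"]},
            "interfaces": [
                {"name": if_name, "ip_addresses": [{"address": a} for a in addrs]}
                for if_name, addrs in info["interface_info"].items()
            ],
        }
        for name, info in device_info.items()
    ]

# Illustrative sample resembling the RESTful script's output
sample = {"ams-edge-01": {"role": "edge",
                          "interface_info": {"Loopback0": ["10.11.128.1/32"]}}}
print(to_graphql_shape(sample))
```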
Final Results
Method  | Average Run Time | # of Queries | Time to Create Script
--------|------------------|--------------|----------------------
GraphQL | 2.2 seconds      | 1            | ~20 minutes
RESTful | 14.9 seconds     | 17           | ~200+ minutes
NOTE: These results are based on the baseline data in the Nautobot demo sandbox. If someone has modified the database, your actual results may vary a bit.
By any measure, GraphQL is the clear choice here! GraphQL allows a much simpler script that is much more efficient than REST.
Imagine your automation task being able to run an average of 12.7 seconds faster (14.9 – 2.2 seconds) by using GraphQL.
I also don’t want to undersell the amount of time and headache required to create the RESTful script, including parsing the REST data and crafting the data structure: it was not pleasant, and we should not talk about it again. Ever.
GraphQL Considerations for Server Load
Querying with GraphQL results in much less coding and post-processing for the user and is generally much more efficient than RESTful calls that achieve the same result.
However, the load on the Nautobot server must still be considered. Depending on the data you are after and your use case, it may make sense to:
Use multiple, targeted GraphQL queries instead of a single GraphQL query with a large scope
Use RESTful queries and offload the processing from the Nautobot server, doing the post-processing on your local host
Depending on how many sites and devices you have, the example query below may put undue load on the Nautobot server:
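For example, a single query that fetches every interface and IP address for every device across all sites asks the server to assemble the entire inventory in one request. A sketch of such a broad query (field names assume the Nautobot 1.x schema) is:

```graphql
query {
  devices {
    name
    site {
      name
    }
    interfaces {
      name
      ip_addresses {
        address
      }
    }
  }
}
```

With many sites and devices, splitting this into one query per site spreads the load into smaller requests.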
This case study validates the clear advantages GraphQL offers: simpler and faster automation, higher query efficiency, less time swimming in extraneous data, and thus less time coding. The great part is that Nautobot delivers GraphQL capabilities that you can leverage right now. Please do give it a try.
If you have questions, you can check out these Network to Code resources for more info:
Thanks to its ability to efficiently allow the request of specific information that can span multiple resources, GraphQL is a powerful query tool. A single GraphQL API request can return information that would otherwise require literally dozens of individual REST queries and extensive data filtering, post-processing, and isolation.
A prior blog post in this series covered how to craft Nautobot GraphQL queries using the Nautobot GraphiQL interface. Recall that the GraphiQL interface is just for testing the GraphQL queries. This post will build on that knowledge, showing you how to leverage those queries to craft remote GraphQL requests for programmatic use via the open source pynautobot Python library. The pynautobot project page is here; the GitHub repository can be found here.
The pynautobot Python library is an API wrapper that allows easy interactions with Nautobot. This specific article will focus on the GraphQL capabilities within the library.
The examples in this post all use the public Nautobot demo site https://demo.nautobot.com/. Readers are encouraged to follow along.
Authentication
Security tokens are typically required for programmatic access to Nautobot's data. The following sections cover how to obtain a token and how to leverage it to craft programmatic GraphQL API calls.
Tokens
Nautobot security tokens are created in the Web UI. To view your token(s), navigate to the API Tokens page under your user profile. If there is no token present, create one or get the necessary permissions to do so.
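Once you have a token, programmatic calls carry it in the Authorization header, as the scripts in this series do (the token value below is a placeholder, matching the placeholder used throughout this post):

```python
# Build request headers carrying a Nautobot API token (placeholder value).
token = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
headers = {
    "Content-Type": "application/json",
    "Authorization": "Token {}".format(token),
}
print(headers["Authorization"][:6])
```

pynautobot accepts the token directly as a parameter instead, so you only build headers by hand for raw requests calls.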
Getting Started with Pynautobot
Installing Pynautobot
pynautobot is a Python library. From your environment, install it via pip3:

pip3 install pynautobot
This first example will walk through a pynautobot GraphQL query in a Python interpreter.
Start a Python shell and import pynautobot:
% python3
Python 3.9.2 (v3.9.2:1a79785e3e, Feb 19 2021, 09:06:10)
[Clang 6.0 (clang-600.0.57)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>>
>>> import pynautobot
>>>
Taking a quick look at the classes within pynautobot, api is the one we want to use. The help tells us that the api class can take url and token as parameters, so create an api object (the token below is a placeholder):

>>> nb = pynautobot.api(url="https://demo.nautobot.com", token="aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa")
>>>
This live example will use a query from a previous post. Define query in Python using the exact text from the query used in example 3 in the GraphiQL post:
NOTE: the query text can be found in the last section of the blog, under Text Query Examples and Results
>>> dir(nb.graphql)
['__class__', << dunders snipped >>, 'api', 'query', 'url']
>>>
>>> help(nb.graphql.query)
Help on method query in module pynautobot.core.graphql:

query(query: str, variables: Optional[Dict[str, Any]] = None) -> pynautobot.core.graphql.GraphQLRecord
    method of pynautobot.core.graphql.GraphQLQuery instance
    Runs query against Nautobot GraphQL endpoint.

    Args:
        query (str): Query string to send to the API
        variables (dict): Dictionary of variables to use with the query string, defaults to None

    Raises:
        GraphQLException:
            - When the query string is invalid.
        TypeError:
            - When `query` passed in is not of type string.
            - When `variables` passed in is not a dictionary.
        Exception:
            - When unknown error is passed in, please open an issue so this can be addressed.

    Examples:
        >>> try:
        ...     response.raise_for_status()
        ... except Exception as e:
        ...     variable = e
        ...
        >>> variable
        >>> variable.response.json()
        {'errors': [{'message': 'Cannot query field "nae" on type "DeviceType". Did you mean "name" or "face"?', 'locations': [{'line': 4, 'column': 5}]}]}
        >>> variable.response.status_code
        400

    Returns:
        GraphQLRecord: Response of the API call
Using the structure above, create the query and explore the results.
This next example features a full script, using an example from the GraphQL Aliasing with Nautobot post in this series. You can also see the YouTube video that accompanies the post here.
The script below features a query that uses GraphQL aliasing to request device names for multiple sites in a single query. The device names in each site will be returned grouped by the site name; each device name will be aliased with an inventory_hostname key (instead of name) for use in an Ansible environment.
The query text for this script was pulled directly from the Aliasing Solves the Problem section of the referenced blog post and copied directly into the query variable.
import json
import pynautobot
from pprint import pprint

print("Querying Nautobot via pynautobot.")
print()

url = "https://demo.nautobot.com"
print("url is: {}".format(url))
print()

query = """
query {
  ams_devices: devices(site:"ams") {
    inventory_hostname: name
  }
  sin_devices: devices(site:"sin") {
    inventory_hostname: name
  }
  bkk_devices: devices(site:"bkk") {
    inventory_hostname: name
  }
}
"""

print("query is:")
print(query)
print()

token = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"

nb = pynautobot.api(url, token)
response = nb.graphql.query(query=query)
response_data = response.json

print("Here is the response data in json:")
pprint(response_data)
print()
print("Here are the bkk devices:")
pprint(response_data['data']['bkk_devices'])
Here is the script’s output:
% python3 -i graphql_ams_query_pynautobot.py
Querying Nautobot via pynautobot.

url is: https://demo.nautobot.com

query is:
query {
  ams_devices: devices(site:"ams") {
    inventory_hostname: name
  }
  sin_devices: devices(site:"sin") {
    inventory_hostname: name
  }
  bkk_devices: devices(site:"bkk") {
    inventory_hostname: name
  }
}

Here is the response data in json:
{'data': {'ams_devices': [{'inventory_hostname': 'ams-edge-01'},
                          {'inventory_hostname': 'ams-edge-02'},
                          {'inventory_hostname': 'ams-leaf-01'},
                          {'inventory_hostname': 'ams-leaf-02'},
                          {'inventory_hostname': 'ams-leaf-03'},
                          {'inventory_hostname': 'ams-leaf-04'},
                          {'inventory_hostname': 'ams-leaf-05'},
                          {'inventory_hostname': 'ams-leaf-06'},
                          {'inventory_hostname': 'ams-leaf-07'},
                          {'inventory_hostname': 'ams-leaf-08'}],
          'bkk_devices': [{'inventory_hostname': 'bkk-edge-01'},
                          {'inventory_hostname': 'bkk-edge-02'},
                          {'inventory_hostname': 'bkk-leaf-01'},
                          {'inventory_hostname': 'bkk-leaf-02'},
                          {'inventory_hostname': 'bkk-leaf-03'},
                          {'inventory_hostname': 'bkk-leaf-04'},
                          {'inventory_hostname': 'bkk-leaf-05'},
                          {'inventory_hostname': 'bkk-leaf-06'},
                          {'inventory_hostname': 'bkk-leaf-07'},
                          {'inventory_hostname': 'bkk-leaf-08'}],
          'sin_devices': [{'inventory_hostname': 'sin-edge-01'},
                          {'inventory_hostname': 'sin-edge-02'},
                          {'inventory_hostname': 'sin-leaf-01'},
                          {'inventory_hostname': 'sin-leaf-02'},
                          {'inventory_hostname': 'sin-leaf-03'},
                          {'inventory_hostname': 'sin-leaf-04'},
                          {'inventory_hostname': 'sin-leaf-05'},
                          {'inventory_hostname': 'sin-leaf-06'},
                          {'inventory_hostname': 'sin-leaf-07'},
                          {'inventory_hostname': 'sin-leaf-08'}]}}

Here are the bkk devices:
[{'inventory_hostname': 'bkk-edge-01'},
 {'inventory_hostname': 'bkk-edge-02'},
 {'inventory_hostname': 'bkk-leaf-01'},
 {'inventory_hostname': 'bkk-leaf-02'},
 {'inventory_hostname': 'bkk-leaf-03'},
 {'inventory_hostname': 'bkk-leaf-04'},
 {'inventory_hostname': 'bkk-leaf-05'},
 {'inventory_hostname': 'bkk-leaf-06'},
 {'inventory_hostname': 'bkk-leaf-07'},
 {'inventory_hostname': 'bkk-leaf-08'}]
>>>
Conclusion
To fully leverage GraphQL’s efficiency, it must be used programmatically. The first post in this series demonstrated using Nautobot’s GraphiQL interface to craft GraphQL queries. This post builds on that by showing how to convert those GraphQL queries into remote requests in Python code for programmatic use.
To find out more about pynautobot, including the additional capabilities not described in this post, start with the pynautobot GitHub repo.