You may not know Francis Bacon, the great English philosopher and one of the founders of the modern scientific method, but you do know his phrase, “knowledge is power.” The phrase comes from Bacon’s Meditationes Sacrae, published in 1597, and was later adopted by Thomas Jefferson on at least four occasions beginning in 1817, in connection with the establishment of a state university in Virginia. Jefferson expanded upon the phrase in a letter to George Ticknor, writing of state legislatures, “...the members of which do not generally possess information enough to percieve [sic] the important truths, that knolege [sic] is power, that knolege [sic] is safety, and that knolege [sic] is happiness.”

Over 200 years later, these words have never been more applicable, especially in the IT industry. For IT professionals to have the “power” to process an unfathomable volume of daily workloads, they need an immense amount of “knowledge” to ascertain the network’s “important truths”: what is connected, what is each piece of hardware or software’s current state, and who has access? With this information, IT professionals can mitigate security risks to ensure the “safety” of digital assets, and they can keep all data-dependent personnel, such as human resources, finance, legal, and the C-suite, “happy” with the constant flow of data needed to perform their jobs.

There it is: a more than 400-year-old adage concerning knowledge, power, and safety in the service of happiness, and the underpinning of our data-dependent lives in 2019. With today’s networks evolving into various cloud structures, sprawling to the edges, proliferating IoT devices, and moving into virtual realms, how is it possible to gain knowledge of all the hardware devices and software systems needed to properly govern these workloads? The answer can be found in technology asset management (TAM).

TAM, formerly known as ITAM, or information technology asset management, is no longer limited to attaining visibility into common IT assets such as servers, workstations, laptops, and mainframes. That operational visibility has expanded to include many things outside of traditional IT, such as HVAC systems, sensors, webcams, and medical devices. With all of these devices now connected to the network, IT managers need management capabilities that are agnostic to the type of device and operating system in order to ensure the network’s integrity.

How important is TAM and its ability to gather data on all connected technology assets? In a recent white paper titled “Today’s Challenges of Device Proliferation,” a poll commissioned by Nlyte Software asked 1,516 technology asset decision-makers in organizations with 1,000 or more employees, “How important is TAM?” Ninety-six percent of respondents from the USA, UK, and France said that hardware and software technology asset control is a top-five priority for their business. And as a top-five priority, the data pool from which decisions are drawn must be a trusted source. Gartner has emphasized this point, holding that all the information about an organization’s technology assets and their current configuration items should come from a single, trusted, unified, and normalized source. This is what TAM delivers.

Why TAM Matters

All the applications and data your organization manages collectively depend on a stable and secure physical infrastructure. Whether that infrastructure is located on-premises, in colocation, or in edge facilities, managers must be certain these resources are not compromised, either intentionally by outside threats or unintentionally from within.

Resources can become compromised when personnel make unplanned or unrecorded changes to assets. Employees may make well-intentioned modifications, such as adding or removing servers or blades, without approval or without recording the change centrally.

Such events can open an organization to disruption, whether through critical outages or cyber-attack. All too commonly, devices are installed that do not meet corporate or regulatory security and safety standards. In addition, new security threats are constantly being identified, which requires that the latest firmware and software patches be applied to close those vulnerabilities.

Yet many organizations do not have a comprehensive list of all their hardware, or of the firmware and software versions running throughout the network. This leaves those systems at greater risk of cyber-attack, and it also makes daily planning and optimization more challenging. Add the difficulties the business may face with software asset management (SAM) on top of TAM matters, and there is a clear case for taking control in order to simplify the business challenge, streamline processes, and manage technology costs and employee time.

Getting Your Arms Around The Baseline

In addition to the issues above, if the network is not scrutinized with a high level of granularity, operating costs will begin to rise, because it becomes more and more difficult to obtain a clear understanding of all the hardware and software pieces now sprawled to the computing edge. Managers will always be held accountable for every device and piece of software running on the network, no matter where it is located. However, managers who are savvy enough to deploy TAM will avoid many hardware and software problems because they can collect more in-depth information. With more data collected, these managers have a single source of truth for the entire network with which to better manage security, compliance, and software licensing.

Additionally, a full understanding of the devices and configurations responsible for processing workloads across this diverse IT ecosystem helps applications run smoothly. A TAM solution removes many of the challenges that inhibit a deep dive into the full IT ecosystem, because responsible infrastructure management today is no longer only about the cabling and devices neatly stacked within the racks. As stated before and worth emphasizing again, data center managers need to grasp how a fractured infrastructure, spread across physical and virtual environments, is still a unified entity that affects all workloads and application performance. To properly grasp the IT infrastructure, a baseline needs to be established, and TAM is the best tool to accomplish the task.

The proper baseline must be deep and granular, and it must be created with the following capabilities and components in mind:

  • Agnostic to the operating system of the technology device.

  • Not limited to desktop, network, data center, or IoT devices; it must be gathered from every category of the organization’s active environment.

  • Able to capture, contain, and update the current state of any technology asset and display the associated components of the technology at the time of the scan.

  • Include all technology hardware and related software, regardless of type.

  • Connect and combine with the intelligence of other standard infrastructure tools, such as Active Directory and other directory services, for ownership and location information.

  • Examine the binaries of installed software to expose particularly difficult areas such as Oracle and IBM software.

  • Capture and serve asset information from highly sensitive and secure areas.

  • Use agentless scanning, with the intrinsic ability to normalize the scanned data.

  • Be organized and technically capable of interacting and connecting with any area of the infrastructure.

  • Gather information using virtually any protocol or, if protocols are disabled or unavailable, fall back on alternative methods to collect the required information.

  • Serve any part of the business or any project with salient asset information.

In essence, the baseline must be dynamic: able to view all technology assets over time, ensure their accuracy, and supply deltas that illustrate the technology’s evolution. By providing seamless integration with any area of the infrastructure, TAM becomes the foundation for establishing that single trusted source of accurate information on all assets.
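To make the idea of a dynamic, delta-aware baseline concrete, here is a minimal Python sketch of one way normalized asset records and scan-to-scan deltas could be represented. The field names, asset types, and sample data are illustrative assumptions for this article, not the schema of any particular TAM product.

```python
# Minimal sketch of a normalized asset baseline and scan-to-scan deltas.
# All field names and sample data below are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass(frozen=True)
class AssetRecord:
    """One normalized technology asset, regardless of the protocol that discovered it."""
    asset_id: str              # stable identifier (e.g., serial number or MAC)
    asset_type: str            # "server", "iot-sensor", "hvac-controller", ...
    operating_system: str      # normalized OS name, or "none" for headless devices
    firmware_version: str
    location: str              # pulled from directory services or DCIM, if available
    installed_software: frozenset = field(default_factory=frozenset)


def scan_delta(previous: dict, current: dict) -> dict:
    """Compare two scans (asset_id -> AssetRecord) and report what changed."""
    added = [a for a in current if a not in previous]
    removed = [a for a in previous if a not in current]
    changed = [a for a in current if a in previous and current[a] != previous[a]]
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "added": added,
        "removed": removed,
        "changed": changed,
    }


if __name__ == "__main__":
    yesterday = {
        "srv-001": AssetRecord("srv-001", "server", "RHEL 8", "2.1", "DC-East",
                               frozenset({"oracle-db 19c"})),
    }
    today = {
        "srv-001": AssetRecord("srv-001", "server", "RHEL 8", "2.2", "DC-East",
                               frozenset({"oracle-db 19c"})),   # firmware patched
        "cam-017": AssetRecord("cam-017", "webcam", "none", "1.0", "Lobby",
                               frozenset()),                     # new, unapproved device
    }
    print(scan_delta(yesterday, today))
```

Running the sketch reports the patched server and the newly discovered webcam, which is exactly the kind of delta a dynamic baseline needs to surface to security and compliance teams.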

Who Benefits From TAM?

Simply put: the whole organization benefits from a TAM solution, not just the IT team tasked with securing and running the technology. Nlyte’s poll, published in “Technology Asset Management: The Hot Take,” focuses on some of these stakeholders and pinpoints their opinions on the value of TAM.

Chief Financial Officer And Finance Team

A TAM solution helps finance, procurement, and asset managers with negotiating purchases and renewals, reconciling reality to the fixed asset system, eliminating wasted expenditure, and validating the disposition of retired assets. According to the poll:

  • 40% of those in finance suggested a lack of technology asset management experience was the biggest barrier to technology asset control.

  • 39% said the same when it comes to software asset management.

  • 37% believe that a financial audit would be the most compelling event to implement asset control across the entire technology stack — the biggest response from any role to any risk event.

  • The highest-rated risk for finance teams is the cost of potential downtime from unreliable technology assets.

Chief Security Officer And Security Team

A TAM solution identifies everything attached to the organization’s network, providing detailed information about location, configuration, and accessibility, and monitoring for unplanned changes, unauthorized access, vulnerable software, and lost or unresponsive assets.

  • TAM is a top-five priority for 51% of the security team, though fewer (41%) claimed it as their top priority.

  • A lack of technology asset management experience and a lack of hardware skills were the co-equal top barriers to technology asset control, each at 35%.

  • The most compelling events that would drive the team to implement asset control across the technology stack are a cybersecurity breach (30%) or a regulatory audit such as SOX, GDPR, or HIPAA (19%).

Chief Compliance Officer And Compliance Team

A TAM solution provides information on asset vulnerability, accessibility, system changes that fall outside compliant configurations, and workflow audit tracking for GDPR, HIPAA, SOX, PCI, and ISO audit reporting and ongoing compliance.

  • For 67% of the compliance team, TAM is a top-five priority, though fewer (29%) claimed it as their top priority.

  • When it comes to barriers to TAM, two stand out: budget constraints (48%) and a lack of technology asset management experience (38%).

  • The most compelling event for the compliance team to get its technology assets under control would be a cybersecurity breach (26%), with a financial audit and a merger or acquisition tied for second place (19% each).

Chief Information Officer And IT Operations

A TAM solution delivers a self-aware IT infrastructure by updating CMDB, DCIM, and BMS systems with current asset configurations and locations, improving efficiency and SLAs for change management workflows and helpdesk tickets.

  • The IT team was identified as the party primarily responsible for tracking assets by 66% of the enterprise decision-makers polled.

  • To the IT team, the biggest barrier to TAM success is budget constraints (41%).

  • The most compelling event for the IT team to get their “house in order” is a cybersecurity breach (33%).

Conclusion

Understanding of technology asset management is high, but adoption and discipline need to be higher across the spectrum of large organizations. As organizations grow, they often lose control over their technology assets. Gaining that control back does not need to be the laborious process it once was.

Gone are the days of spreadsheets, counting assets with a clipboard, and keeping watch on paperwork while cross-referencing dates with a calendar to ensure patches, maintenance, and licensing updates happened. There is no need for expensive and unreliable hand-scanners and RFID systems either, as previous management approaches required. A lightweight, agentless solution that is not limited to certain vendor platforms or industry protocols removes many headaches. Such a solution allows for the discovery of any and all types of technology assets: compute, storage, network, software, firmware, IoT, and more.

A modern TAM solution establishes a technology asset baseline of everything attached to the network. From this baseline, all users can see at a glance, via a user-friendly dashboard or tailored reports, exactly what changes are occurring, when, and by whom. It removes the headaches and labor-intensive audits required for the discovery, inventory, and entitlement reconciliation of all technology assets.

What’s more, one of the great leaps forward in visibility and transparency is the integration that comes standard with modern solutions, allowing data to be shared with other business systems: ERP and fixed asset systems; HR and security; data center infrastructure management (DCIM); IT infrastructure library (ITIL); IT service management (ITSM); configuration management database (CMDB); building management systems (BMS); and others. Linking assets, locations, usage, and people can eliminate hundreds of hours of inventory, audit, and compliance activity, giving any organization more time to focus on delivering better business value while gaining more time for innovation.
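As a concrete illustration of that integration idea, the sketch below shows how a normalized asset record might be pushed to a downstream system over HTTP. The endpoint path, payload fields, and bearer-token authentication are placeholder assumptions for illustration only, not the API of any specific CMDB, ITSM, or DCIM product.

```python
# Hypothetical sketch of sharing normalized asset data with a downstream system
# such as a CMDB. The URL, route, payload fields, and token are placeholders.
import json
import urllib.request


def push_asset_to_cmdb(asset: dict, base_url: str, api_token: str) -> int:
    """POST one normalized asset record to a (hypothetical) CMDB endpoint."""
    req = urllib.request.Request(
        url=f"{base_url}/api/assets",                  # placeholder route
        data=json.dumps(asset).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_token}",    # placeholder auth scheme
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


if __name__ == "__main__":
    record = {
        "asset_id": "srv-001",
        "asset_type": "server",
        "location": "DC-East",
        "firmware_version": "2.2",
        "last_seen": "2019-06-01T12:00:00Z",
    }
    # In practice the URL and token would come from the organization's own
    # integration configuration; the call below is left commented out because
    # the host is fictional.
    # push_asset_to_cmdb(record, "https://cmdb.example.com", "REPLACE_ME")
    print(json.dumps(record, indent=2))
```

The design choice here is simply to keep one normalized record format at the center and let each downstream system (ERP, CMDB, DCIM, BMS) consume it through its own connector, so the single source of truth is maintained in one place rather than duplicated per tool.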