Browse Definitions by Alphabet

  • daemon - In computing, a daemon (pronounced DEE-muhn) is a program that runs continuously as a background process and wakes up to handle periodic service requests, which often come from remote processes.
  • daily stand-up meeting - A daily stand-up meeting is a short organizational meeting that is held each day.
  • dark post - A dark post is an inexpensive sponsored message on a social media website that is not published to the sponsor page timeline and will not display in follower feeds organically.
  • dark web monitoring - Dark web monitoring is the process of searching for and continuously tracking information on the dark web.
  • data abstraction - Data abstraction is the reduction of a particular body of data to a simplified representation of the whole.
  • Data Access Arrangement (DAA) - A Data Access Arrangement (DAA) is an electronic interface within a computer and its modem to a public telephone line.
  • data analytics (DA) - Data analytics (DA) is the process of examining data sets to find trends and draw conclusions about the information they contain.
  • data anonymization - Data anonymization describes various techniques to remove or block data containing personally identifiable information (PII).
  • data archiving - Data archiving moves data that is no longer actively used to a separate storage device for long-term retention.
  • data at rest - Data at rest is data held in computer storage, as opposed to data that is traversing a network or temporarily residing in computer memory to be read or updated.
  • data availability - Data availability is a term used by computer storage manufacturers and storage service providers to describe the degree to which data remains accessible at a required level of performance, in situations ranging from normal to disastrous.
  • data binding - Data binding is the process that couples two data sources together and synchronizes them.
  • data breach - A data breach is a cyber attack in which sensitive, confidential or otherwise protected data has been accessed or disclosed in an unauthorized fashion.
  • data broker (information broker) - A data broker, also called an information broker or information reseller, is a business that collects large amounts of personal information about consumers.
  • data catalog - A data catalog is a software application that creates an inventory of an organization's data assets to help data professionals and business users find relevant data for analytics uses.
  • data center - A data center is a facility composed of networked computers, storage systems and computing infrastructure that organizations use to assemble, process, store and disseminate large amounts of data.
  • data center as a service (DCaaS) - Data center as a service (DCaaS) is the provision of off-site physical data center facilities and infrastructure to clients.
  • data center capacity planning - Data center capacity planning ensures that an IT organization has enough facility space, power and computing resources to support average and peak workloads.
  • Data center career path: Fast Guide - Data centers offer competitive salaries, enjoyable work and diverse opportunities in the tech sector, whether you are starting as an entry-level data center technician or have the skills to become a data center architect.
  • data center chiller - A data center chiller is a cooling system used in a data center to remove heat from one element and deposit it into another element.
  • data center infrastructure efficiency (DCiE) - Data Center Infrastructure Efficiency (DCiE) is a metric used to determine the energy efficiency of a data center.
  • data center infrastructure management (DCIM) - Data center infrastructure management (DCIM) is the convergence of IT and building facilities functions within an organization.
  • data center interconnect (DCI) - Data center interconnect (DCI) technology links two or more data centers together to share resources.
  • data center management - Data center management refers to the set of tasks and activities handled by an organization for the day-to-day requirements of operating a data center.
  • data center modernization - Data center modernization includes updating and improving a data center to meet the requirements of the current and the next generation of workloads.
  • data center resiliency - Resiliency is the ability of a server, network, storage system or an entire data center to recover quickly and continue operating even when there has been an equipment failure, power outage or other disruption.
  • data center services - Data center services provide the supporting components necessary to the proper operation of a data center.
  • data citizen - A data citizen is an employee who relies on data to make decisions and perform job responsibilities.
  • data classification - Data classification is the process of organizing data into categories that make it easy to retrieve, sort and store for future use.
  • data clean room - A data clean room is a technology service that helps content platforms keep first-party user data private when interacting with advertising providers.
  • data cleansing (data cleaning, data scrubbing) - Data cleansing, also referred to as data cleaning or data scrubbing, is the process of fixing incorrect, incomplete, duplicate or otherwise erroneous data in a data set.
  • data collection - Data collection is the process of gathering data for use in business decision-making, strategic planning, research and other purposes.
  • data compliance - Data compliance is the process of identifying the governance requirements that apply to data protection, security, storage and other activities, and of establishing policies, procedures and protocols that protect data from unauthorized access and use, malware and other cybersecurity threats.
  • data compression - Data compression is a reduction in the number of bits needed to represent data.
  • data curation - Data curation is the process of creating, organizing and maintaining data sets so they can be accessed and used by people looking for information.
  • data de-identification - Data de-identification is the process of decoupling or masking data to prevent certain data elements from being associated with an individual.
  • data deduplication - Data deduplication is a process that eliminates redundant copies of data and reduces storage overhead.
  • data deduplication hardware - Data deduplication hardware is disk storage that eliminates redundant copies of data and retains one instance to be stored.
  • Data Definition Language (DDL) - Data Definition Language (DDL) is used to create and modify the structure of objects in a database using predefined commands and a specific syntax.
  • data destruction - Data destruction is the process of destroying data stored on tapes, hard disks and other forms of electronic media so that it's completely unreadable and can't be accessed or used for unauthorized purposes.
  • data dictionary - A data dictionary is a collection of descriptions of the data objects or items in a data model to which programmers and others can refer.
  • data dignity - Data dignity, also known as data as labor, is a theory positing that people should be compensated for the data they have created.
  • data dredging (data fishing) - Data dredging -- sometimes referred to as data fishing -- is a data mining practice in which large data volumes are analyzed to find any possible relationships between them.
  • Data Dynamics StorageX - Data Dynamics StorageX is a software suite that specializes in data migration and Microsoft Distributed File System management.
  • data engineer - A data engineer is an IT professional whose primary job is to prepare data for analytical or operational uses.
  • data exploration - Data exploration is the first step in data analysis involving the use of data visualization tools and statistical techniques to uncover data set characteristics and initial patterns.
  • data feed - A data feed is an ongoing stream of structured data that provides users with updates of current information from one or more sources.
  • data governance policy - A data governance policy is a documented set of guidelines for ensuring that an organization's data and information assets are managed consistently and used properly.
  • data gravity - Data gravity is the ability of a body of data to attract applications, services and other data.
  • data historian - A data historian is a software program that records the data created by processes running in a computer system.
  • data in motion - Data in motion, also referred to as data in transit or data in flight, is a process in which digital information is transported between locations either within or between computer systems.
  • data in use - Data in use is data that is currently being updated, processed, accessed and read by a system.
  • data ingestion - Data ingestion is the process of obtaining and importing data for immediate use or storage in a database.
  • data integration - Data integration is the process of combining data from multiple source systems to create unified sets of information for both operational and analytical uses.
  • data integrity - Data integrity is the assurance that digital information is uncorrupted and can only be accessed or modified by those authorized to do so.
  • data journalism - Data journalism is an approach to writing for the public in which the journalist analyzes large data sets to identify potential news stories.
  • data lake - A data lake is a storage repository that holds a vast amount of raw data in its native format until it is needed for analytics applications.
  • data lakehouse - A data lakehouse is a data management architecture that combines the key features and the benefits of a data lake and a data warehouse.
  • data latency - Data latency is the time it takes for data packets to be stored or retrieved.
  • data lifecycle management (DLM) - Data lifecycle management (DLM) is a policy-based approach to managing the flow of an information system's data throughout its lifecycle: from creation and initial storage to when it becomes obsolete and is deleted.
  • data link layer - The data link layer is the protocol layer in a program that handles how data moves in and out of a physical link in a network.
  • data literacy - Data literacy is the ability to derive meaningful information from data, just as literacy in general is the ability to derive information from the written word.
  • data loss - Data loss is the intentional or unintentional destruction of information.
  • data management platform (DMP) - A data management platform (DMP), also referred to as a unified data management platform (UDMP), is a centralized system for collecting and analyzing large sets of data originating from disparate sources.
  • data marketplace (data market) - A data marketplace, or data market, is an online store where people can buy data.
  • data masking - Data masking is a method of creating a structurally similar but inauthentic version of an organization's data that can be used for purposes such as software testing and user training.
  • data mesh - Data mesh is a decentralized data management architecture for analytics and data science.
  • data migration - Data migration is the process of transferring data between data storage systems, data formats or computer systems.
  • data minimization - Data minimization aims to reduce the amount of collected data to only include necessary information for a specific purpose.
  • data mining - Data mining is the process of sorting through large data sets to identify patterns and relationships that can help solve business problems through data analysis.
  • data modeling - Data modeling is the process of creating a simplified visual diagram of a software system and the data elements it contains, using text and symbols to represent the data and how it flows.
  • data observability - Data observability is a process and set of practices that aim to help data teams understand the overall health of the data in their organization's IT systems.
  • data pipeline - A data pipeline is a set of network connections and processing steps that moves data from a source system to a target location and transforms it for planned business uses.
  • data plan (mobile data plan) - A data plan is an agreement between a mobile carrier and a customer that specifies how much mobile data the user can access, usually per month, for a specific fee on a carrier network.
  • data plane - The data plane -- sometimes known as the user plane, forwarding plane, carrier plane or bearer plane -- is the part of a network that carries user traffic.
  • data point - A data point is a discrete unit of information.
  • data poisoning (AI poisoning) - Data or AI poisoning attacks are deliberate attempts to manipulate the training data of artificial intelligence and machine learning models to corrupt their behavior and elicit skewed, biased or harmful outputs.
  • data portability - Data portability is the ability to move data among different application programs, computing environments or cloud services.
  • data preprocessing - Data preprocessing, a component of data preparation, describes any type of processing performed on raw data to prepare it for another data processing procedure.
  • data processing - Data processing refers to essential operations executed on raw data to transform the information into a useful format or structure that provides valuable insights to a user or organization.
  • data profiling - Data profiling refers to the process of examining, analyzing, reviewing and summarizing data sets to gain insight into the quality of data.
  • Data Protection Act 2018 (DPA 2018) - The Data Protection Act 2018 (DPA 2018) is a legislative framework in the United Kingdom governing the processing of personal data.
  • data protection as a service (DPaaS) - Data protection as a service (DPaaS) involves managed services that safeguard an organization's data.
  • data protection authorities - Data protection authorities (DPAs) are public authorities responsible for enforcing data protection laws and regulations within a specific jurisdiction.
  • data protection impact assessment (DPIA) - A data protection impact assessment (DPIA) is a process designed to help organizations determine how data processing systems, procedures or technologies affect individuals' privacy and eliminate any risks that might violate compliance.
  • data protection management (DPM) - Data protection management (DPM) is the administration, monitoring and management of backup processes to ensure backup tasks run on schedule and data is securely backed up and recoverable.
  • data quality - Data quality is a measure of a data set's condition based on factors such as accuracy, completeness, consistency, reliability and validity.
  • data recovery - Data recovery restores data that has been lost, accidentally deleted, corrupted or made inaccessible.
  • data recovery agent (DRA) - A data recovery agent (DRA) is a Microsoft Windows user account with the ability to decrypt data that was encrypted by other users.
  • data reduction - Data reduction lowers the amount of capacity required to store data.
  • data replication - Data replication is the process of copying data from one location to another.
  • data residency - Data residency refers to the physical or geographic location of an organization's data or information.
  • data restore - Data restore is the process of copying backup data from secondary storage and restoring it to its original location or a new location.
  • data retention policy - In business settings, data retention encompasses all processes for storing and preserving data, as well as the specific time periods and policies that determine how, and for how long, data should be retained.
  • data sampling - Data sampling is a statistical analysis technique used to select, manipulate and analyze a representative subset of data points to identify patterns and trends in the larger data set being examined.
  • data science platform - A data science platform is software that allows data scientists to uncover actionable insights from data and communicate those insights throughout an enterprise within a single environment.
  • data scientist - A data scientist is an analytics professional who is responsible for collecting, analyzing and interpreting data to help drive decision-making in an organization.
  • data set - A data set, also spelled 'dataset,' is a collection of related data that's usually organized in a standardized format.
  • data source name (DSN) - A data source name (DSN) is a data structure containing information about a specific database to which an Open Database Connectivity (ODBC) driver needs to connect.
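The Data Definition Language (DDL) entry above can be illustrated with a minimal sketch. The `customer` table and its columns are hypothetical, and SQLite's in-memory database stands in for any SQL database that accepts DDL statements:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# CREATE defines the structure of a new database object.
cur.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
# ALTER modifies the structure of an existing object.
cur.execute("ALTER TABLE customer ADD COLUMN email TEXT")

# Inspect the resulting structure (column names).
cols = [row[1] for row in cur.execute("PRAGMA table_info(customer)")]
print(cols)  # ['id', 'name', 'email']

# DROP removes the object entirely.
cur.execute("DROP TABLE customer")
conn.close()
```

Note that DDL statements describe structure only; inserting or querying rows would use Data Manipulation Language (DML) statements such as `INSERT` and `SELECT`.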
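As a small illustration of the data compression entry above, assuming only Python's standard-library `zlib` module, a repetitive byte string shrinks losslessly:

```python
import zlib

# Highly repetitive input compresses well; real-world savings vary with the data.
text = b"data compression " * 100
packed = zlib.compress(text)
restored = zlib.decompress(packed)

assert restored == text          # lossless round trip
assert len(packed) < len(text)   # fewer bytes (and so fewer bits) needed
print(len(text), len(packed))
```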
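The data deduplication entry above can be sketched in a few lines. The block-level store and SHA-256 fingerprints here are illustrative assumptions, not a description of any particular product:

```python
import hashlib

def dedupe_store(blocks):
    """Retain one stored copy per unique block; return the store plus per-block references."""
    store = {}  # digest -> single retained instance of the block
    refs = []   # lightweight references that replace redundant copies
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)
        refs.append(digest)
    return store, refs

blocks = [b"alpha", b"beta", b"alpha", b"alpha"]
store, refs = dedupe_store(blocks)
print(len(blocks), "blocks,", len(store), "stored")  # 4 blocks, 2 stored
```

The storage saving comes from keeping each unique block once and replacing duplicates with references, which is the core idea behind both software and hardware deduplication.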
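The data masking entry above can be sketched as follows; the email format and the letters-only replacement rule are simplifying assumptions for illustration:

```python
import random

def mask_email(email, rng=random.Random(0)):
    """Return a structurally similar but inauthentic email: same length, same domain."""
    local, _, domain = email.partition("@")
    # Replace the sensitive local part with fake characters of the same length.
    fake = "".join(rng.choice("abcdefghijklmnopqrstuvwxyz") for _ in local)
    return f"{fake}@{domain}"

masked = mask_email("alice@example.com")
print(masked)  # same shape as the original, but with a fake local part
```

Because the masked value keeps the original's structure, it can exercise the same validation and display logic in software testing or user training without exposing the real data.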