In computing, CHKDSK (short for "check disk") is a system tool and command in DOS, Digital Research FlexOS, IBM/Toshiba 4690 OS, IBM OS/2, Microsoft Windows and related operating systems. It verifies the file system integrity of a volume and attempts to fix logical file system errors. It is similar to the fsck command in Unix and similar to Microsoft ScanDisk, which co-existed with CHKDSK in Windows 9x and MS-DOS 6.x.
An early implementation of a 'CheckDisk' was the CHECKDSK that was part of Digital Equipment Corporation's hardware diagnostics, running on early-1970s TENEX and TOPS-20. The CHKDSK command was first implemented in 1980 by Tim Paterson and included in Seattle Computer Products' 86-DOS. The command is available in MS-DOS versions 1 and later. CHKDSK is implemented as an external command. MS-DOS versions 2.x–4.x use chkdsk.com as
244-426: A 19-inch rack . The backplanes allowed 25 modules in a single 5-1/4 inch section of rack, and allowed the high densities needed to build a computer. The original laboratory and system module lines were offered in 500 kilocycle, 5 megacycle and 10 megacycle versions. In all cases, the supply voltages were -15 and +10 volts, with logic levels of -3 volts (passive pull-down) and 0 volts (active pull-up). DEC used
A backup rotation scheme is a system of backing up data to computer media that limits the number of backups of different dates retained separately, through appropriate re-use of the storage media by overwriting backups that are no longer needed. The scheme determines how and when each piece of removable storage is used for a backup operation and how long it is retained once it has backup data stored on it. The 3-2-1 rule can aid in
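As an illustration of the idea, the following minimal Python sketch implements a simple first-in, first-out rotation in which the oldest retained backup medium is the next to be overwritten; the TapePool class and tape labels are invented for the example.

```python
from collections import deque

# Minimal sketch of a first-in, first-out media rotation: the oldest
# tape is reused (overwritten) once the retention limit is reached.
class TapePool:
    def __init__(self, labels, retained=7):
        self.retained = retained      # number of backups kept before reuse
        self.in_use = deque()         # tapes currently holding backups
        self.free = deque(labels)     # blank or expired tapes

    def next_tape(self):
        """Pick the tape to write tonight's backup to."""
        if self.free:
            return self.free.popleft()
        # No free media: expire (overwrite) the oldest retained backup.
        return self.in_use.popleft()

    def record_backup(self, tape):
        self.in_use.append(tape)
        while len(self.in_use) > self.retained:
            self.free.append(self.in_use.popleft())

pool = TapePool([f"TAPE{i:02d}" for i in range(8)], retained=7)
for night in range(30):
    tape = pool.next_tape()
    pool.record_backup(tape)          # overwrite happens when a tape is reused
```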
A disk array (possibly connected to a SAN) is an example of online backup storage. This type of storage is convenient and fast, but is vulnerable to being deleted or overwritten, whether by accident, by malicious action, or in the wake of a data-deleting virus payload. Nearline storage is typically less accessible and less expensive than online storage, but still useful for backup data storage. A mechanical device
610-610: A VAX CPU was the VAX-11/780 , announced in October 1977, which DEC referred to as a superminicomputer . Although it was not the first 32-bit minicomputer, the VAX-11/780's combination of features, price, and marketing almost immediately propelled it to a leadership position in the market after it was released in 1978. VAX systems were so successful that in 1983, DEC canceled its Jupiter project , which had been intended to build
732-418: A balance between accessibility, security and cost. These media management methods are not mutually exclusive and are frequently combined to meet the user's needs. Using on-line disks for staging data before it is sent to a near-line tape library is a common example. Online backup storage is typically the most accessible type of data storage, and can begin a restore in milliseconds. An internal hard disk or
854-404: A cable. Because the data is not accessible via any computer except during limited periods in which they are written or read back, they are largely immune to on-line backup failure modes. Access time varies depending on whether the media are on-site or off-site. Backup media may be sent to an off-site vault to protect against a disaster or other site-specific problem. The vault can be as simple as
976-469: A corrupted file that is unusable. This is also the case across interrelated files, as may be found in a conventional database or in applications such as Microsoft Exchange Server . The term fuzzy backup can be used to describe a backup of live data that looks like it ran correctly, but does not represent the state of the data at a single point in time. Backup options for data files that cannot be or are not quiesced include: Not all information stored on
a data security risk if they are lost or stolen. Encrypting the data on these media can mitigate this problem; however, encryption is a CPU-intensive process that can slow down backup speeds, and the security of the encrypted backups is only as effective as the security of the key management policy. When there are many more computers to be backed up than there are destination storage devices,
a directory under the partition's root, named found.000, and renamed into generic hexadecimally numbered files and directories starting with file00000000.chk and dir_00000000.chk respectively. On the Windows NT family, a standard CHKDSK scan consists of three phases of testing file metadata. It looks for errors but does not fix them unless it is explicitly ordered to do so. The same applies to surface scan—this test, which could be extremely time-consuming on large or low-performance disks,
1342-444: A fault of the drive typically just halts the spinning. Optical media is modular ; the storage controller is not tied to media itself like with hard drives or flash storage (→ flash memory controller ), allowing it to be removed and accessed through a different drive. However, recordable media may degrade earlier under long-term exposure to light. Some optical storage systems allow for cataloged data backups without human contact with
a layer of data protection. However, the users must trust the provider to maintain the privacy and integrity of their data, with confidentiality enhanced by the use of encryption. Because speed and availability are limited by a user's online connection, users with large amounts of data may need to use cloud seeding and large-scale recovery. Various methods can be used to manage backup media, striking
a limited period of time, so an offsite copy remains the ideal choice. Because there is no perfect storage medium, many backup experts recommend maintaining a second copy on a local physical device, even if the data is also backed up offsite. An unstructured repository may simply be a stack of tapes, DVD-Rs or external HDDs with minimal information about what was backed up and when. This method
1708-583: A new virtual memory system, and would also improve performance by processing twice as much data at a time. The system would, however, maintain compatibility with the PDP-11, by operating in a second mode that sent its 16-bit words into the 32-bit internals, while mapping the PDP-11's 16-bit memory space into the larger virtual 32-bit space. The result was the VAX architecture, where VAX stands for Virtual Address eXtension (from 16 to 32 bits). The first computer to use
1830-476: A new device to be added easily, generally only requiring plugging a hardware interface board into the backplane and possibly adding a jumper to the wire wrapped backplane, and then installing software that read and wrote to the mapped memory to control it. The relative ease of interfacing spawned a huge market of third party add-ons for the PDP-11, which made the machine even more useful. The combination of architectural innovations proved superior to competitors and
1952-467: A profit at the end of its first year. The original Laboratory Modules were soon supplemented with the "Digital System Module " line, which were identical internally but packaged differently. The Systems Modules were designed with all of the connections at the back of the module using 22-pin Amphenol connectors, and were attached to each other by plugging them into a backplane that could be mounted in
2074-438: A selection of System Building Blocks to implement a small 12-bit machine, and attached it to a variety of analog-to-digital (A to D) input/output (I/O) devices that made it easy to interface with various analog lab equipment. The LINC proved to attract intense interest in the scientific community, and has since been referred to as the first real minicomputer , a machine that was small and inexpensive enough to be dedicated to
2196-485: A self-sustaining business, the company would be free to use them to develop a complete computer in their Phase II. The newly christened "Digital Equipment Corporation" received $ 70,000 from AR&D for a 70% share of the company, and began operations in a Civil War -era textile mill in Maynard, Massachusetts , where plenty of inexpensive manufacturing space was available. In early 1958, DEC shipped its first products,
2318-511: A separate input/output processor for further performance gains. Over 400 PDP-15's were ordered in the first eight months of production, and production eventually amounted to 790 examples in 12 basic models. However, by this time other machines in DEC's lineup could fill the same niche at even lower price points, and the PDP-15 would be the last of the 18-bit series. In 1962, Lincoln Laboratory used
2440-564: A series of machines known as the PDP line, with the PDP-8 and PDP-11 being among the most successful minis in history. Their success was only surpassed by another DEC product, the late-1970s VAX "supermini" systems that were designed to replace the PDP-11. Although a number of competitors had successfully competed with Digital through the 1970s, the VAX cemented the company's place as a leading vendor in
2562-503: A shock-absorbing case around the hard disk, and claim a range of higher drop specifications. Over a period of years the stability of hard disk backups is shorter than that of tape backups. External hard disks can be connected via local interfaces like SCSI , USB , FireWire , or eSATA , or via longer-distance technologies like Ethernet , iSCSI , or Fibre Channel . Some disk-based backup systems, via Virtual Tape Libraries or otherwise, support data deduplication, which can reduce
a single large mainframe case, with a hexagonal control panel containing switches and lights mounted to lie at table-top height at one end of the mainframe. Above the control panel was the system's standard input/output solution, a punched tape reader and writer. Most systems were purchased with two peripherals: the Type 30 vector graphics display, and a Soroban Engineering modified IBM Model B Electric typewriter that
2806-659: A single task even in a small lab. Seeing the success of the LINC, in 1963 DEC took the basic logic design but stripped away the extensive A to D systems to produce the PDP-5 . The new machine, the first outside the PDP-1 mould, was introduced at WESTCON on August 11, 1963. A 1964 ad expressed the main advantage of the PDP-5, "Now you can own the PDP-5 computer for what a core memory alone used to cost: $ 27,000". 116 PDP-5s were produced until
a standard configuration to many systems rather than as a tool for making ongoing backups of diverse systems. An incremental backup stores data changed since a reference point in time. Duplicate copies of unchanged data are not copied. Typically a full backup of all files is made once or at infrequent intervals, serving as the reference point for an incremental repository. Subsequently, a number of incremental backups are made after successive time periods. Restores begin with
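A minimal sketch of the selection step of an incremental backup follows, assuming that file modification times are a reliable change indicator (real tools may also consult archive bits, sizes, or checksums); the paths shown are illustrative.

```python
import os, time

# Minimal sketch of incremental file selection: pick only files whose
# modification time is newer than the reference point (the time of the
# last backup). The copy step itself is omitted.
def files_changed_since(root, reference_time):
    changed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getmtime(path) > reference_time:
                    changed.append(path)
            except OSError:
                pass  # file vanished or unreadable; a real tool would log this
    return changed

last_backup_time = time.time() - 24 * 3600   # assume the last run was a day ago
to_copy = files_changed_since("/data", last_backup_time)
```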
3050-604: A successor to the PDP-10 mainframe, and instead focused on promoting the VAX as the single computer architecture for the company. Supporting the VAX's success was the VT52 , one of the most successful smart terminals . Building on earlier less successful models, the VT05 and VT50 , the VT52 was the first terminal that did everything one might want in a single inexpensive chassis. The VT52
3172-406: A system administrator's home office or as sophisticated as a disaster-hardened, temperature-controlled, high-security bunker with facilities for backup media storage. A data replica can be off-site but also on-line (e.g., an off-site RAID mirror). A backup site or disaster recovery center is used to store data that can enable computer systems and networks to be restored and properly configured in
3294-440: A turn to use the stripped-down TX-0, while largely ignoring a faster IBM machine that was also available. The two decided that the draw of interactive computing was so strong that they felt there was a market for a small machine dedicated to this role, essentially a commercialized TX-0. They could sell this to users where the graphical output or real-time operation would be more important than outright performance. Additionally, as
is "back up", whereas the noun and adjective form is "backup". Backups can be used to recover data after its loss from data deletion or corruption, or to recover data from an earlier time. Backups provide a simple form of IT disaster recovery; however, not all backup systems are able to reconstitute a computer system or other complex configuration such as a computer cluster, Active Directory server, or database server. A backup system contains at least one copy of all data considered worth saving. The data storage requirements can be large. An information repository model may be used to provide structure to this storage. There are different types of data storage devices used for copying backups of data that
3538-506: Is already in secondary storage onto archive files . There are also different ways these devices can be arranged to provide geographic dispersion, data security , and portability . Data is selected, extracted, and manipulated for storage. The process can include methods for dealing with live data , including open files, as well as compression, encryption, and de-duplication . Additional techniques apply to enterprise client-server backup . Backup schemes may include dry runs that validate
is an appended ".bak" extension to the file name. A reverse incremental backup method stores a recent archive file "mirror" of the source data and a series of differences between the "mirror" in its current state and its previous states. A reverse incremental backup method starts with a non-image full backup. After the full backup is performed, the system periodically synchronizes the full backup with
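The following minimal Python sketch illustrates one reverse-incremental step under simple assumptions: the mirror always holds the newest data, and a superseded file is moved into a dated delta directory before being replaced. The directory layout and the modification-time test are invented for the example.

```python
import os, shutil, time

# Sketch of a reverse-incremental step: the mirror holds the newest copy;
# before a changed file overwrites its mirror copy, the old mirror version
# is moved into a dated "reverse delta" directory.
def sync_reverse_incremental(source, mirror, deltas):
    stamp = time.strftime("%Y%m%d-%H%M%S")
    for dirpath, _dirs, files in os.walk(source):
        rel = os.path.relpath(dirpath, source)
        for name in files:
            src = os.path.join(dirpath, name)
            dst = os.path.join(mirror, rel, name)
            if os.path.exists(dst) and os.path.getmtime(dst) >= os.path.getmtime(src):
                continue                      # mirror already current
            if os.path.exists(dst):           # preserve the superseded version
                old = os.path.join(deltas, stamp, rel, name)
                os.makedirs(os.path.dirname(old), exist_ok=True)
                shutil.move(dst, old)
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.copy2(src, dst)            # mirror now matches the source
```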
3782-677: Is most famous as the machine for which the Unix operating system was originally written. Unix ran only on DEC systems until the Interdata 8/32 . A more dramatic upgrade to the PDP-1 series was introduced in August 1966, the PDP-9 . The PDP-9 was instruction-compatible with the PDP-4 and −7, but ran about twice as fast as the −7 and was intended to be used in larger deployments. At only $ 19,900 in 1968,
3904-522: Is no reason for any individual to have a computer in his home." Unsurprisingly, DEC did not put much effort into the microcomputer area in the early days of the market. In 1977, the Heathkit H11 was announced; a PDP-11 in kit form. At the beginning of the 1980s, DEC built the VT180 (codenamed "Robin"), which was a VT100 terminal with an added Z80 -based microcomputer running CP/M , but this product
is not carried out unless explicitly requested. CHKDSK requires exclusive write access to the volume to perform repairs. Because of this exclusive-access requirement, CHKDSK cannot check the system disk while the system is running normally. Instead, it sets a dirty bit on the disk volume and then reboots the computer. During Windows start-up, a special version of CHKDSK called Autochk (a native mode application)
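The dirty bit that triggers this boot-time check can be inspected with the stock fsutil tool. Below is a minimal sketch, assuming a Windows host, sufficient privileges, and Python's standard subprocess module; the output parsing is illustrative only.

```python
import subprocess

# Sketch: query the volume "dirty" flag that causes Autochk to run at boot.
def volume_is_dirty(drive="C:"):
    result = subprocess.run(
        ["fsutil", "dirty", "query", drive],
        capture_output=True, text=True, check=True,
    )
    # fsutil prints a sentence such as "Volume - C: is Dirty" or "... is NOT Dirty".
    return "not dirty" not in result.stdout.lower()

if volume_is_dirty("C:"):
    print("Autochk will examine this volume at the next restart.")
```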
is online, Bott concluded "it's arguably a feature, not a bug, and the likelihood that you'll ever crash a system this way is very, very small and completely avoidable." DR DOS 6.0 also includes an implementation of the CHKDSK command. The FreeDOS version was developed by Imre Leber and is licensed under the GNU GPL 2. The ReactOS implementation is based on a free clone developed by Mark Russinovich for Sysinternals in 1998. It
is required, Action Center notifies the user to take the volume offline at the earliest convenience. Windows Vista and Windows Server 2008 added a self-healing ability, turned on by default, in addition to providing the CHKDSK command. It detects physical file system errors and silently fixes them on the fly. Thus, many problems previously discovered by running CHKDSK never appear. It is administered by the fsutil repair command. Criticism has been aimed at
is started by SMSS.EXE and checks and attempts to repair the file system if the dirty bit is set. Because of the exclusive access requirement and the time-consuming nature of CHKDSK operation, Windows Vista implemented a new file system health model in which the operating system fixes errors on the volumes as it encounters them. In the event that the problem is grave and a full scan
4514-448: Is stored in discrete units, known as files . These files are organized into filesystems . Deciding what to back up at any given time involves tradeoffs. By backing up too much redundant data, the information repository will fill up too quickly. Backing up an insufficient amount of data can eventually lead to the loss of critical information. Files that are actively being updated present a challenge to back up. One way to back up live data
4636-549: Is the IBM 3592 (also referred to as the TS11xx series). The Oracle StorageTek T10000 was discontinued in 2016. The use of hard disk storage has increased over time as it has become progressively cheaper. Hard disks are usually easy to use, widely available, and can be accessed quickly. However, hard disk backups are close-tolerance mechanical devices and may be more easily damaged than tapes, especially while being transported. In
4758-412: Is the easiest to implement, but unlikely to achieve a high level of recoverability as it lacks automation. A repository using this backup method contains complete source data copies taken at one or more specific points in time. Copying system images , this method is frequently used by computer technicians to record known good configurations. However, imaging is generally more useful as a way of deploying
is to temporarily quiesce them (e.g., close all files), take a "snapshot", and then resume live operations. At this point the snapshot can be backed up through normal methods. A snapshot is an instantaneous function of some filesystems that presents a copy of the filesystem as if it were frozen at a specific point in time, often by a copy-on-write mechanism. Snapshotting a file while it is being changed results in
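A minimal sketch of the quiesce-snapshot-resume pattern described above follows; pause_writes() and resume_writes() are placeholders for whatever consistency mechanism the application actually provides, and a plain directory copy stands in for a real copy-on-write snapshot.

```python
import shutil
from contextlib import contextmanager

# Sketch of the quiesce-snapshot-resume pattern. The pause/resume hooks
# are placeholders, not real APIs; they represent flushing buffers,
# closing files, or calling a snapshot/VSS writer.
@contextmanager
def quiesced(app):
    app.pause_writes()          # bring the data to a consistent state
    try:
        yield
    finally:
        app.resume_writes()     # let live operations continue

def snapshot_backup(app, data_dir, snapshot_dir):
    with quiesced(app):
        # With a snapshotting filesystem this copy would be an instantaneous
        # copy-on-write snapshot; a plain copy is shown for illustration.
        shutil.copytree(data_dir, snapshot_dir)
    # The snapshot can now be backed up by normal means while the app runs.
```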
5002-404: Is usually used to move media units from storage into a drive where the data can be read or written. Generally it has safety properties similar to on-line storage. An example is a tape library with restore times ranging from seconds to a few minutes. Off-line storage requires some direct action to provide access to the storage media: for example, inserting a tape into a tape drive or plugging in
5124-727: The SAGE system for the US Air Force , which used large screens and light guns to allow operators to interact with radar data stored in the computer. When the Air Force project wound down, the Lab turned their attention to an effort to build a version of the Whirlwind using transistors in place of vacuum tubes . In order to test their new circuitry, they first built a small 18-bit machine known as TX-0 , which first ran in 1956. When
5246-416: The computer industry from the 1960s to the 1990s. The company was co-founded by Ken Olsen and Harlan Anderson in 1957. Olsen was president until he was forced to resign in 1992, after the company had gone into precipitous decline. The company produced many different product lines over its history. It is best known for the work in the minicomputer market starting in the early 1960s. The company produced
the file allocation table of a disk uses 256 sectors, running CHKDSK /F can cause data loss and running UNDELETE can cause unpredictable results. This normally affects disks with a capacity of approximately a multiple of 128 MB. This applies to CHKDSK.EXE and UNDELETE.EXE bearing a datestamp of April 9, 1991. This bug was fixed in MS-DOS 5.0a. CHKDSK can be run from the DOS prompt, Windows Explorer, the Windows Command Prompt, Windows PowerShell or the Recovery Console. On Windows NT operating systems, CHKDSK can also check
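For scripted use, CHKDSK can also be invoked from those environments programmatically. The following is a minimal sketch, assuming a Windows host and Python's standard library, that runs a scan and inspects the exit code; the /F switch is documented, but the exit-code interpretation shown should be treated as approximate.

```python
import subprocess

# Sketch: run CHKDSK on a volume from a script and inspect the exit code.
# Repairs (/F or /R) require elevation and exclusive access to the volume.
def check_volume(drive="D:", fix=False):
    cmd = ["chkdsk", drive]
    if fix:
        cmd.append("/F")        # documented switch: fix logical errors
    completed = subprocess.run(cmd, capture_output=True, text=True)
    # By convention, exit code 0 means no errors were found; nonzero codes
    # indicate errors found and/or not fixed -- treat the mapping as approximate.
    return completed.returncode, completed.stdout

code, report = check_volume("D:")
print("exit code:", code)
```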
5490-802: The "11" architecture was soon the industry leader, propelling DEC back to a strong market position. The design was later expanded to allow paged physical memory and memory protection features, useful for multitasking and time-sharing . Some models supported separate instruction and data spaces for an effective virtual address size of 128 KB within a physical address size of up to 4 MB. Smaller PDP-11s, implemented as single-chip CPUs, continued to be produced until 1996, by which time over 600,000 had been sold. The PDP-11 supported several operating systems, including Bell Labs ' new Unix operating system as well as DEC's DOS-11 , RSX-11 , IAS, RT-11 , DSM-11, and RSTS/E . Many early PDP-11 applications were developed using standalone paper-tape utilities. DOS-11
5612-543: The "Digital Laboratory Module" line. The Modules consisted of a number of individual electronic components and germanium transistors mounted to a circuit board , the actual circuits being based on those from the TX-2. The Laboratory Modules were packaged in an extruded aluminum housing, intended to sit on an engineer's workbench, although a rack-mount bay was sold that held nine laboratory modules. They were then connected together using banana plug patch cords inserted at
5734-439: The "sandbox" for a rising generation of engineers and computer scientists. Large numbers of PDP-11/70s were deployed in telecommunications and industrial control applications. AT&T Corporation became DEC's largest customer. RT-11 provided a practical real-time operating system in minimal memory, allowing the PDP-11 to continue DEC's critical role as a computer supplier for embedded systems . Historically, RT-11 also served as
5856-433: The 1950s, wiped out when new technical developments rendered their platforms obsolete, and even large companies like RCA and General Electric were failing to make a profit in the market. The only serious expression of interest came from Georges Doriot and his American Research and Development Corporation (AR&D). Worried that a new computer company would find it difficult to arrange further financing, Doriot suggested
5978-572: The 1980s, culminating in the NVAX microprocessor implementation and VAX 7000/10000 series in the early 1990s. When a DEC research group demonstrated two prototype microcomputers in 1974—before the debut of the MITS Altair —Olsen chose to not proceed with the project. The company similarly rejected another personal computer proposal in 1977. At the time these systems were of limited utility, and Olsen famously derided them in 1977, stating "There
6100-450: The CPU which allowed one to easily see the logic modules plugged into the wire-wrapped backplane of the CPU. Sold standard with 4 kWords of 12-bit core memory and a Teletype Model 33 ASR for basic input/output, the machine listed for only $ 18,000. The PDP-8 is referred to as the first real minicomputer because of its sub-$ 25,000 price. Sales were, unsurprisingly, very strong, and helped by
the PC, but was more expensive than, and completely incompatible with, IBM PC hardware and software, offering far fewer options for customizing a system. Unlike CP/M and DOS microcomputers, every copy of every program for the Professional had to be provided with a unique key for the particular machine and CPU for which it was bought. At that time this was mainstream policy, because most computer software
the PDP-8, all in software. Although not a huge seller, 142 LINC-8s were sold starting at $38,500. Like the original LINC-to-PDP-5 evolution, the LINC-8 was then modified into the single-processor PDP-12, adding another 1000 machines to the 12-bit family. Newer circuitry designs led to the PDP-8/I and PDP-8/L in 1968. In 1975, one year after an agreement between DEC and Intersil, the Intersil 6100 chip
6466-459: The PDP-9 was a big seller, eventually selling 445 machines, more than all of the earlier models combined. Even while the PDP-9 was being introduced, its replacement was being designed, and was introduced as 1969's PDP-15 , which re-implemented the PDP-9 using integrated circuits in place of modules. Much faster than the PDP-9 even in basic form, the PDP-15 also included a floating point unit and
6588-700: The Professional was a superior machine, running inferior software. In addition, a new user would have to learn an awkward, slow, and inflexible menu-based user interface which appeared to be radically different from PC DOS or CP/M , which were more commonly used on the 8080- and 8088-based microcomputers of the time. A second offering, the DECmate II was the latest version of the PDP-8-based word processors, but not really suited to general computing, nor competitive with Wang Laboratories ' popular word processing equipment. The most popular early DEC microcomputer
6710-511: The Rainbow, and in its standard form was the first widely marketed diskless workstation . In 1984, DEC launched its first 10 Mbit/s Ethernet . Ethernet allowed scalable networking, and VAXcluster allowed scalable computing. Combined with DECnet and Ethernet-based terminal servers ( LAT ), DEC had produced a networked storage architecture which allowed them to compete directly with IBM. Ethernet replaced Token Ring , and went on to become
6832-492: The System Modules to build their "Memory Test" machine for testing core memory systems, selling about 50 of these pre-packaged units over the next eight years. The PDP-1 and LINC computers were also built using System Modules (see below). Modules were part of DEC's product line into the 1970s, although they went through several evolutions during this time as technology changed. The same circuits were then packaged as
6954-469: The TX-0 successfully proved the basic concepts, attention turned to a much larger system, the 36-bit TX-2 with a then-enormous 64 kWords of core memory . Core was so expensive that parts of TX-0's memory were stripped for the TX-2, and what remained of the TX-0 was then given to MIT on permanent loan. At MIT, Ken Olsen and Harlan Anderson noticed something odd: students would line up for hours to get
the ability to address more memory, often by extending the address format to 18 or 24 bits in machines that were otherwise similar to their earlier 16-bit designs. In contrast, DEC decided to make a more radical departure. In 1976, they began the design of a machine whose entire architecture was expanded from the 16-bit PDP-11 to a new 32-bit basis. This would allow the addressing of very large memories, which were to be controlled by
7198-437: The ability to use a single storage device with several simultaneous backups can be useful. However cramming the scheduled backup window via "multiplexed backup" is only used for tape destinations. The process of rearranging the sets of backups in an archive file is known as refactoring. For example, if a backup system uses a single tape each day to store the incremental backups for all the protected computers, restoring one of
the accumulated changes in data) increases, so does the time to perform the differential backup. Restoring an entire system requires starting from the most recent full backup and then applying just the last differential backup. A differential backup copies files that have been created or changed since the last full backup, regardless of whether any other differential backups have been made since, whereas an incremental backup copies files that have been created or changed since
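The practical consequence for restores can be shown in a few lines of illustrative Python: an incremental chain needs the full backup plus every incremental made since it, while a differential chain needs only the full backup plus the newest differential. The records below are invented placeholders.

```python
# Sketch of what a restore has to apply under each scheme (newest last).
full          = {"when": "Sun", "kind": "full"}
incrementals  = [{"when": d, "kind": "incremental"} for d in ("Mon", "Tue", "Wed")]
differentials = [{"when": d, "kind": "differential"} for d in ("Mon", "Tue", "Wed")]

# Incremental restore: the full backup plus EVERY incremental since it.
incremental_restore_chain = [full] + incrementals

# Differential restore: the full backup plus only the NEWEST differential,
# because each differential already contains everything since the full.
differential_restore_chain = [full, differentials[-1]]
```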
7442-463: The adoption of "\" for pathnames in MS-DOS and Microsoft Windows as opposed to "/" in Unix . The evolution of the PDP-11 followed earlier systems, eventually including a single-user deskside personal computer form, the MicroPDP-11. In total, around 600,000 PDP-11s of all models were sold, and a wide variety of third-party peripheral vendors had also entered the computer product ecosystem. It
the alleged leak's significance. Steven Sinofsky of Microsoft also responded that Microsoft could not reproduce a crash either but that the massive memory consumption was by design, to improve performance, and not a leak. Ed Bott of ZDNet also reviewed the claim with his own tests and observed that no crash would occur. Noting that chkdsk /r, by design, does not work on the system drive while Windows
7686-510: The amount of disk storage capacity consumed by daily and weekly backup data. Optical storage uses lasers to store and retrieve data. Recordable CDs , DVDs, and Blu-ray Discs are commonly used with personal computers and are generally cheap. The capacities and speeds of these discs have typically been lower than hard disks or tapes. Advances in optical media may shrink that gap in the future. Potential future data losses caused by gradual media degradation can be predicted by measuring
the backup process. It states that there should be at least three copies of the data, stored on two different types of storage media, and that one copy should be kept offsite, in a remote location (this can include cloud storage). Two or more different media should be used so that a single kind of failure cannot destroy every copy (for example, optical discs may tolerate being underwater while LTO tapes may not, and SSDs cannot fail due to head crashes or damaged spindle motors since they do not have any moving parts, unlike hard drives). An offsite copy protects against fire, theft of physical media (such as tapes or discs) and natural disasters like floods and earthquakes. Physically protected hard drives are an alternative to an offsite copy, but they have limitations like only being able to resist fire for
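As a sketch of how the rule can be checked mechanically, the following Python fragment evaluates an invented plan description against the three conditions (copy count, distinct media, offsite copy); the field names are assumptions for the example.

```python
# Sketch: check a backup plan against the 3-2-1 rule. The plan structure
# is invented for illustration.
def satisfies_3_2_1(copies):
    """copies: list of dicts like {"medium": "lto_tape", "offsite": True}."""
    enough_copies = len(copies) >= 3
    enough_media  = len({c["medium"] for c in copies}) >= 2
    one_offsite   = any(c["offsite"] for c in copies)
    return enough_copies and enough_media and one_offsite

plan = [
    {"medium": "internal_hdd", "offsite": False},   # the live data itself
    {"medium": "lto_tape",     "offsite": False},   # local tape backup
    {"medium": "cloud_object", "offsite": True},    # remote copy
]
print(satisfies_3_2_1(plan))   # True
```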
7930-512: The basis for the new design, although when they first viewed the proposal, management was not impressed and almost cancelled it. The result was the PDP-11 , released in 1970. It differed from earlier designs considerably. In particular, the new design did not include many of the addressing modes that were intended to make programs smaller in memory, a technique that was widely used on other DEC machines and CISC designs in general. This would mean
8052-451: The better-established vendors like IBM or Honeywell , in spite of its low cost around $ 300,000. Only 23 were sold, or 26 depending on the source, and unlike other models the low sales meant the PDP-6 was not improved with successor versions. However, the PDP-6 is historically important as the platform that introduced "Monitor", an early time-sharing operating system that would evolve into
8174-774: The company's first computer, the PDP-1 . In keeping with Doriot's instructions, the name was an initialism for " Programmable Data Processor ", leaving off the term "computer". As Gurley put it, "We aren't building computers, we're building 'Programmable Data Processors'." The prototype was first shown publicly at the Joint Computer Conference in Boston in December 1959. The first PDP-1 was delivered to Bolt, Beranek and Newman in November 1960, and formally accepted
8296-411: The compatible DECSYSTEM-20 , along with a TOPS-20 operating system that included virtual memory support. The Jupiter Project was supposed to continue the mainframe product line into the future by using gate arrays with an innovative Air Mover Cooling System, coupled with a built-in floating point processing engine called "FBOX". The design was intended for a top tier scientific computing niche, yet
8418-680: The computer is stored in files. Accurately recovering a complete system from scratch requires keeping track of this non-file data too. It is frequently useful or required to manipulate the data being backed up to optimize the backup process. These manipulations can improve backup speed, restore speed, data security, media usage and/or reduced bandwidth requirements. Out-of-date data can be automatically deleted, but for personal backup applications—as opposed to enterprise client-server backup applications where automated data "grooming" can be customized—the deletion can at most be globally delayed or be disabled. Various schemes can be employed to shrink
8540-436: The computer space. As microcomputers improved in the late 1980s, especially with the introduction of RISC -based workstation machines, the performance niche of the minicomputer was rapidly eroded. By the early 1990s, the company was in turmoil as their mini sales collapsed and their attempts to address this by entering the high-end market with machines like the VAX 9000 were market failures. After several attempts to enter
8662-457: The computers could require many tapes. Refactoring could be used to consolidate all the backups for a single computer onto a single tape, creating a "synthetic full backup". This is especially useful for backup systems that do incrementals forever style backups. Sometimes backups are copied to a staging disk before being copied to tape. This process is sometimes referred to as D2D2T, an acronym for Disk-to-disk-to-tape . It can be useful if there
8784-649: The consistency of live data, protecting self-consistent files but requiring applications "be quiesced and made ready for backup." Near-CDP is more practicable for ordinary personal backup applications, as opposed to true CDP, which must be run in conjunction with a virtual machine or equivalent and is therefore generally used in enterprise client-server backups. Software may create copies of individual files such as written documents, multimedia projects, or user preferences, to prevent failed write events caused by power outages, operating system crashes, or exhausted disk space, from causing data loss. A common implementation
8906-486: The critical performance measurement was based upon COBOL compilation which did not fully utilize the primary design features of Jupiter technology. When the Jupiter Project was cancelled in 1983, some of the engineers adapted aspects of the 36-bit design into a forthcoming 32-bit design, releasing the high-end VAX8600 in 1985. DEC's successful entry into the computer market took place during a fundamental shift in
9028-408: The data frozen at a particular point in time . Near-CDP (except for Apple Time Machine ) intent-logs every change on the host system, often by saving byte or block-level differences rather than file-level differences. This backup method differs from simple disk mirroring in that it enables a roll-back of the log and thus a restoration of old images of data. Intent-logging allows precautions for
9150-410: The data has to be copied onto an archive file data storage medium. The medium used is also referred to as the type of backup destination. Magnetic tape was for a long time the most commonly used medium for bulk data storage, backup, archiving, and interchange. It was previously a less expensive option, but this is no longer the case for smaller amounts of data. Tape is a sequential access medium, so
9272-477: The data. This allows restoration of data to any point in time and is the most comprehensive and advanced data protection. Near-CDP backup applications—often marketed as "CDP"—automatically take incremental backups at a specific interval, for example every 15 minutes, one hour, or 24 hours. They can therefore only allow restores to an interval boundary. Near-CDP backup applications use journaling and are typically based on periodic "snapshots", read-only copies of
the discs, allowing for longer data integrity. A French study in 2008 indicated that the lifespan of typically-sold CD-Rs was 2–10 years, but one manufacturer later estimated the longevity of its CD-Rs with a gold-sputtered layer to be as high as 100 years. Sony's proprietary Optical Disc Archive could, as of 2016, reach a read rate of 250 MB/s. Solid-state drives (SSDs) use integrated circuit assemblies to store data. Flash memory, thumb drives, USB flash drives, CompactFlash, SmartMedia, Memory Sticks, and Secure Digital card devices are relatively expensive for their low capacity, but convenient for backing up relatively low data volumes. A solid-state drive does not contain any movable parts, making it less susceptible to physical damage, and can have huge throughput, from around 500 Mbit/s up to 6 Gbit/s. Available SSDs have become more capacious and cheaper. Flash memory backups are stable for fewer years than hard disk backups. Remote backup services or cloud backups involve service providers storing data offsite. This has been used to protect against events such as fires, floods, or earthquakes which could destroy locally stored backups. Cloud-based backup (through services such as Google Drive and Microsoft OneDrive) provides
the disk surface for bad sectors and mark them (in MS-DOS 6.x and Windows 9x, this is a task done by Microsoft ScanDisk). The Windows Server version of CHKDSK is RAID-aware and can fully recover data in bad sectors of a disk in a RAID-1 or RAID-5 array if the other disks in the set are intact. Fragments of files and directories deemed corrupt as a result of, for example, power outages while writing, over-long file names, or invalid characters in file names, are moved into
the dominant networking model in use today. In September 1985, DEC became the fifth company to register a .com domain name (dec.com). Data backup In information technology, a backup, or data backup, is a copy of computer data taken and stored elsewhere so that it may be used to restore the original after a data loss event. The verb form, referring to the process of doing so,
9760-502: The event of a disaster. Some organisations have their own data recovery centres, while others contract this out to a third-party. Due to high costs, backing up is rarely considered the preferred method of moving data to a DR site. A more typical way would be remote disk mirroring , which keeps the DR data as up to date as possible. A backup operation starts with selecting and extracting coherent units of data. Most data on modern computer systems
9882-402: The executable file. MS-DOS versions 5.x and later use chkdsk.exe as the executable file. CHKDSK can also show the memory usage, this was used before the command MEM.EXE was introduced in MS-DOS 4.0 to show the memory usage. In DR DOS the parameter /A limited the output to only show the memory usage. CHKDSK and UNDELETE in MS-DOS 5.0 have a bug which can corrupt data: If
10004-473: The fact that several competitors had just entered the market with machines aimed directly at the PDP-5's market space, which the PDP-8 trounced. This gave the company two years of unrestricted leadership, and eventually 1450 "straight eight" machines were produced before it was replaced by newer implementations of the same basic design. DEC hit an even lower price-point with the PDP-8/S, the S for "serial". As
10126-475: The first "R" (red) series " Flip-Chip " modules. Later, other Flip-Chip module series provided additional speed, much higher logic density, and industrial I/O capabilities. DEC published extensive data about the modules in free catalogs that became very popular. With the company established and a successful product on the market, DEC turned its attention to the computer market once again as part of its planned "Phase II". In August 1959, Ben Gurley started design of
10248-487: The fledgling company change its business plan to focus less on computers, and even change their name from "Digital Computer Corporation". The pair returned with an updated business plan that outlined two phases for the company's development. They would start by selling computer modules as stand-alone devices that could be purchased separately and wired together to produce a number of different digital systems for lab use. Then, if these "digital modules" were able to build
10370-411: The front of the modules. Three versions were offered, running at 5 MHz (1957), 500 kHz (1959), or 10 MHz (1960). The Modules proved to be in high demand by other computer companies, who used them to build equipment to test their own systems. Despite the recession of the late 1950s, the company sold $ 94,000 worth of these modules during 1958 alone (equivalent to $ 992,700 in 2023), turning
10492-413: The inspiration for many microcomputer OS's, as these were generally being written by programmers who cut their teeth on one of the many PDP-11 models. For example, CP/M used a command syntax similar to RT-11's, and even retained the awkward PIP program used to copy data from one computer device to another. As another historical footnote, DEC's use of "/" for "switches" (command-line options) would lead to
10614-490: The lab's various computer projects. The Lab is best known for their work on what would today be known as "interactivity", and their machines were among the first where operators had direct control over programs running in real-time. These had started in 1944 with the famed Whirlwind , which was originally developed to make a flight simulator for the US Navy , although this was never completed. Instead, this effort evolved into
10736-411: The last full backup and then apply the incrementals. Some backup systems can create a synthetic full backup from a series of incrementals, thus providing the equivalent of frequently doing a full backup. When done to modify a single archive file, this speeds restores of recent versions of files. Continuous Data Protection (CDP) refers to a backup that instantly saves a copy of every change made to
10858-548: The limited information available, they used it to process radar cross section data for the Lockheed A-12 reconnaissance aircraft . Gordon Bell remembered that it was being used in Oregon some time later, but could not recall who was using it. In November 1962, DEC introduced the $ 65,000 PDP-4 . The PDP-4 was similar to the PDP-1 and used a similar instruction set, but used slower memory and different packaging to lower
10980-441: The lines were shut down in early 1967. Like the PDP-1 before it, the PDP-5 inspired a series of newer models based on the same basic design that would go on to be more famous than its parent. On March 22, 1965, DEC introduced the PDP-8 , which replaced the PDP-5's modules with the new R-series modules using Flip Chips. The machine was re-packaged into a small tabletop case, which remains distinctive for its use of smoked plastic over
the live copy, while storing the data necessary to reconstruct older versions. This can be done either using hard links, as Apple Time Machine does, or using binary diffs. A differential backup saves only the data that has changed since the last full backup. This means a maximum of two backups from the repository are used to restore the data. However, as time from the last full backup (and thus
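A minimal sketch of the hard-link approach follows, assuming a filesystem that supports hard links and using modification time and size as a simplistic change test; the paths and function name are illustrative.

```python
import os, shutil

# Sketch of hard-link versioning: each backup directory looks complete,
# but unchanged files are hard links into the previous backup, so only
# changed files consume new space.
def link_dest_backup(source, previous_backup, new_backup):
    for dirpath, _dirs, files in os.walk(source):
        rel = os.path.relpath(dirpath, source)
        os.makedirs(os.path.join(new_backup, rel), exist_ok=True)
        for name in files:
            src  = os.path.join(dirpath, name)
            prev = os.path.join(previous_backup, rel, name)
            dest = os.path.join(new_backup, rel, name)
            unchanged = (
                os.path.exists(prev)
                and os.path.getmtime(prev) >= os.path.getmtime(src)
                and os.path.getsize(prev) == os.path.getsize(src)
            )
            if unchanged:
                os.link(prev, dest)      # zero extra space for this file
            else:
                shutil.copy2(src, dest)  # store the new version
```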
11224-496: The machine would cost much less than the larger systems then available, it would also be able to serve users that needed a lower-cost solution dedicated to a specific task, where a larger 36-bit machine would not be needed. In 1957, when the pair and Ken's brother Stan sought capital, they found that the American business community was hostile to investing in computer companies. Many smaller computer companies had come and gone in
11346-438: The machine would spend more time accessing memory, which would slow it down. However, the machine also extended the idea of multiple "General Purpose Registers" (GPRs), which gave the programmer flexibility to use these high-speed memory caches as they needed, potentially addressing the performance issues. A major advance in the PDP-11 design was DEC's Unibus , which supported all peripherals through memory mapping . This allowed
11468-432: The mid-2000s, several drive manufacturers began to produce portable drives employing ramp loading and accelerometer technology (sometimes termed a "shock sensor"), and by 2010 the industry average in drop tests for drives with that technology showed drives remaining intact and working after a 36-inch non-operating drop onto industrial carpeting. Some manufacturers also offer 'ruggedized' portable hard drives, which include
11590-400: The most recent backup of any type (full or incremental). Changes in files may be detected through a more recent date/time of last modification file attribute , and/or changes in file size. Other variations of incremental backup include multi-level incrementals and block-level incrementals that compare parts of files instead of just entire files. Regardless of the repository model that is used,
11712-425: The name implies the /S used a serial arithmetic unit, which was much slower but reduced costs so much that the system sold for under $ 10,000. DEC then used the new PDP-8 design as the basis for a new LINC, the two-processor LINC-8 . The LINC-8 used one PDP-8 CPU and a separate LINC CPU, and included instructions to switch from one to the other. This allowed customers to run their existing LINC programs, or "upgrade" to
11834-458: The next April. The PDP-1 sold in basic form for $ 120,000 (equivalent to $ 9,269,291 in 2023). By the time production ended in 1969, 53 PDP-1s had been delivered. The PDP-1 was supplied standard with 4096 words of core memory , 18-bits per word, and ran at a basic speed of 100,000 operations per second. It was constructed using many System Building Blocks that were packaged into several 19-inch racks . The racks were themselves packaged into
11956-581: The price. Like the PDP-1, about 54 PDP-4s were eventually sold, most to a customer base similar to the original PDP-1. In 1964, DEC introduced its new Flip Chip module design, and used it to re-implement the PDP-4 as the PDP-7 . The PDP-7 was introduced in December 1964, and about 120 were eventually produced. An upgrade to the Flip Chip led to the R series, which in turn led to the PDP-7A in 1965. The PDP-7
12078-447: The rate of continuously writing or reading data can be very fast. While tape media itself has a low cost per space, tape drives are typically dozens of times as expensive as hard disk drives and optical drives . Many tape formats have been proprietary or specific to certain markets like mainframes or a particular brand of personal computer. By 2014 LTO had become the primary tape technology. The other remaining viable "super" format
12200-432: The rate of correctable minor data errors , of which consecutively too many increase the risk of uncorrectable sectors. Support for error scanning varies among optical drive vendors. Many optical disc formats are WORM type, which makes them useful for archival purposes since the data cannot be changed. Moreover, optical discs are not vulnerable to head crashes , magnetism, imminent water ingress or power surges ; and,
12322-476: The reliability of the data being backed up. There are limitations and human factors involved in any backup scheme. A backup strategy requires an information repository, "a secondary storage space for data" that aggregates backups of data "sources". The repository could be as simple as a list of all backup media (DVDs, etc.) and the dates produced, or could include a computerized index, catalog, or relational database . The backup data needs to be stored, requiring
the report, the chkdsk /r command would cause the memory consumption to reach the maximum and the system to crash. Randall C. Kennedy of InfoWorld attributed the original report to "various Web sources" and said that in his tests, the memory consumption reached above 90%, although he did not experience a crash. Nevertheless, Kennedy took the memory consumption for a critical bug that would derail Windows 7's launch and chastised Microsoft. Tom Warren of Neowin dismissed Kennedy's assessment of
12566-607: The same design. During construction of the prototype PDP-1, some design work was carried out on a 24-bit PDP-2, and the 36-bit PDP-3. Although the PDP-2 never proceeded beyond the initial design, the PDP-3 found some interest and was designed in full. Only one PDP-3 appears to have been built, in 1960, by the CIA's Scientific Engineering Institute (SEI) in Waltham, Massachusetts . According to
the size of the source data to be stored so that it uses less storage space. Compression is frequently a built-in feature of tape drive hardware. Redundancy due to backing up similarly configured workstations can be reduced, thus storing just one copy. This technique can be applied at the file or raw block level. This potentially large reduction is called deduplication. It can occur on a server before any data moves to backup media, sometimes referred to as source/client-side deduplication. This approach also reduces bandwidth required to send backup data to its target media. The process can also occur at
the target storage device, sometimes referred to as inline or back-end deduplication. Sometimes backups are duplicated to a second set of storage media. This can be done to rearrange the archive files to optimize restore speed, or to have a second copy at a different location or on a different storage medium, as in the disk-to-disk-to-tape capability of enterprise client-server backup. High-capacity removable storage media such as backup tapes present
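As an illustration of source-side deduplication, the sketch below splits a file into fixed-size chunks, hashes each chunk, and stores only chunks not already present; the in-memory dictionary stands in for the backup target's chunk index, and the chunk size is an arbitrary example.

```python
import hashlib

# Sketch of source-side deduplication: only chunks the repository has not
# seen before are transferred; the file is recorded as a list of hashes.
CHUNK_SIZE = 4 * 1024 * 1024   # 4 MiB chunks, an arbitrary illustrative size
store = {}                     # chunk hash -> chunk bytes (stand-in for the target index)

def backup_file_deduplicated(path):
    recipe = []                # ordered chunk hashes needed to rebuild the file
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in store:
                store[digest] = chunk      # only new data crosses the wire
            recipe.append(digest)
    return recipe              # stored as the file's metadata in the repository
```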
the tendency of AUTOCHK to automatically modify the file system when not explicitly requested by the user, who may wish to back up their data first, because an attempted repair may scramble, undermine and disown file and directory paths, especially on a multiboot installation where multiple operating systems may have written to the same partition. Before the release of Windows 7, InfoWorld reported an alleged memory leak in CHKDSK; according to
13054-526: The time, Compaq was focused on the enterprise market and had recently purchased several other large vendors. DEC was a major player overseas where Compaq had less presence. However, Compaq had little idea what to do with its acquisitions, and soon found itself in financial difficulty of its own. Compaq subsequently merged with Hewlett-Packard (HP) in May 2002. Ken Olsen and Harlan Anderson were two engineers who had been working at MIT Lincoln Laboratory on
13176-489: The underlying organization of the machines from word lengths based on 6-bit characters to those based on 8-bit words needed to support ASCII . DEC began studies of such a machine, the PDP-X, but Ken Olsen did not support it as he could not see how it offered anything their existing 12-bit or 18-bit machines didn't. This led the leaders of the PDP-X project to leave DEC and start Data General , whose 16-bit Data General Nova
13298-467: The widely used TOPS-10 . When newer Flip Chip packaging allowed the PDP-6 to be re-implemented at a much lower cost, DEC took the opportunity to refine their 36-bit design, introducing the PDP-10 in 1968. The PDP-10 was as much a success as the PDP-6 was a commercial failure; about 700 mainframe PDP-10s were sold before production ended in 1984. The PDP-10 was widely used in university settings, and thus
13420-592: The workstation and file server market, the DEC Alpha product line began to make successful inroads in the mid-1990s, but was too late to save the company. DEC was acquired in June 1998 by Compaq in what was at that time the largest merger in the history of the computer industry. During the purchase, some parts of DEC were sold to other companies; the compiler business and the Hudson Fab were sold to Intel . At
was adapted to ReactOS by Emanuele Aliberti in 1999 and supports volumes using the FAT32 filesystem. The command does not support volumes using the Btrfs filesystem, although ReactOS has supported it since version 0.4.1. Digital Equipment Corporation Digital Equipment Corporation (DEC /dɛk/), using the trademark Digital, was a major American company in
13664-478: Was either bought from the company that built the computer or custom-constructed for one client. However, the emerging third-party software industry disregarded the PDP-11/Professional line and concentrated on other microcomputers where distribution was easier. At DEC itself, creating better programs for the Professional was not a priority, perhaps from fear of cannibalizing the PDP-11 line. As a result,
13786-507: Was even sold in kit form as the Heathkit H11 , although it proved too expensive for Heathkit 's traditional hobbyist market. The introduction of semiconductor memory in the early 1970s, and especially dynamic RAM shortly thereafter, led to dramatic reductions in the price of memory as the effects of Moore's Law were felt. Within years, it was common to equip a machine with all the memory it could address, typically 64 KB on 16-bit machines. This led vendors to introduce new designs with
13908-623: Was eventually ported along with MS-DOS 2.0 and introduced in late 1983. Although the Rainbow generated some press, it was unsuccessful due to its high price and lack of marketing and sales support. By late 1983 IBM was outselling DEC's personal computers by more than ten to one. A further system was introduced in 1986 as the VAXmate , which included Microsoft Windows 1.0 and used VAX/VMS-based file and print servers along with integration into DEC's own DECnet -family, providing LAN/WAN connection from PC to mainframe or supermini. The VAXmate replaced
14030-636: Was followed by the even more successful VT100 and its follow-ons, making DEC one of the largest terminal vendors in the industry. This was supported by a line of inexpensive computer printers , the DECwriter line. With the VT and DECwriter series, DEC could now offer a complete top-to-bottom system from computer to all peripherals, which formerly required collecting the required devices from different suppliers. The VAX processor architecture and family of systems evolved and expanded through several generations during
14152-731: Was initially available only to DEC employees. It was only after IBM had successfully launched the IBM PC in 1981 that DEC responded with their own systems. In 1982, DEC introduced not one, but three incompatible machines which were each tied to different proprietary architectures. The first, the DEC Professional , was based on the PDP-11/23 (and later, the 11/73) running the RSX-11M+ derived, but menu-driven, P/OS ("Professional Operating System"). This DEC machine easily outperformed
14274-498: Was launched, effectively a PDP-8 on a chip. This was a way to allow PDP-8 software to be run even after the official end-of-life announcement for the DEC PDP-8 product line. While the PDP-5 introduced a lower-cost line, 1963's PDP-6 was intended to take DEC into the mainframe market with a 36-bit machine. However, the PDP-6 proved to be a "hard sell" with customers, as it offered few obvious advantages over similar machines from
14396-539: Was released in 1969 and was a huge success. The success of the Nova finally prompted DEC to take the switch seriously, and they began a crash program to introduce a 16-bit machine of their own. The new system was designed primarily by Harold McFarland, Gordon Bell , Roger Cady, and others. The project was able to leap forward in design with the arrival of Harold McFarland, who had been researching 16-bit designs at Carnegie Mellon University . One of his simpler designs became
14518-404: Was the PDP-11's first disk operating system, but was soon supplanted by more capable systems. RSX provided a general-purpose multitasking environment and supported a wide variety of programming languages . IAS was a time-sharing version of RSX-11D. Both RSTS and Unix were time-sharing systems available to educational institutions at little or no cost, and these PDP-11 systems were destined to be
14640-454: Was the basis of many advances in computing and operating system design during the 1970s. DEC later re-branded all of the models in the 36-bit series as the "DECsystem-10", and PDP-10s are generally referred to by the model of their CPU, starting with the "KA10", soon upgraded to the "KI10" (I:Integrated circuit); then to "KL10" (L:Large-scale integration ECL logic ); also the "KS10" (S: Small form factor ). Unified product line upgrades produced
14762-535: Was the dual-processor (Z80 and 8088) Rainbow 100 , which ran the 8-bit CP/M operating system on the Z80 and the 16-bit CP/M-86 operating system on the Intel 8088 processor. It could also run a UNIX System III implementation called VENIX . Applications from standard CP/M could be re-compiled for the Rainbow, but by this time users were expecting custom-built (pre-compiled binary) applications such as Lotus 1-2-3 , which
14884-441: Was used as a printer . The Soroban system was notoriously unreliable, and often replaced with a modified Friden Flexowriter , which also contained its own punched tape system. A variety of more-expensive add-ons followed, including magnetic tape systems, punched card readers and punches, and faster punched tape and printer systems. When DEC introduced the PDP-1, they also mentioned larger machines at 24, 30 and 36 bits, based on