NetCDF

NetCDF (Network Common Data Form) is a set of software libraries and self-describing, machine-independent data formats that support the creation, access, and sharing of array-oriented scientific data. The project homepage is hosted by the Unidata program at the University Corporation for Atmospheric Research (UCAR), which is also the chief source of netCDF software, standards development, and updates. The format is an open standard. NetCDF Classic and 64-bit Offset Format are an international standard of the Open Geospatial Consortium.

The project started in 1988 and is still actively supported by UCAR. The original netCDF binary format (released in 1990, now known as "netCDF classic format") is still widely used across the world and continues to be fully supported in all netCDF releases. Version 4.0 (released in 2008) allowed the use of the HDF5 data file format. Version 4.1 (2010) added support for C and Fortran client access to specified subsets of remote data via OPeNDAP. Version 4.3.0 (2012) added a CMake build system for Windows builds. Version 4.7.0 (2019) added support for reading Amazon S3 objects. Version 4.8.0 (2021) added further support for Zarr. Version 4.9.0 (2022) added support for Zstandard compression. Further releases are planned to improve performance, add features, and fix bugs. The format was originally based on the conceptual model of the Common Data Format developed by NASA, but has since diverged and is not compatible with it.

The netCDF libraries support multiple different binary formats for netCDF files, including the classic format, the 64-bit offset format, and the HDF5-based netCDF-4 format. All formats are "self-describing". This means that there is a header which describes the layout of the rest of the file, in particular the data arrays, as well as arbitrary file metadata in the form of name/value attributes. The format is platform independent, with issues such as endianness being addressed in the software libraries. The data are stored in a fashion that allows efficient subsetting. Starting with version 4.0, the netCDF API allows the use of the HDF5 data format. NetCDF users can create HDF5 files with benefits not available with the netCDF classic format, such as much larger files and multiple unlimited dimensions. Full backward compatibility in accessing old netCDF files and using previous versions of the C and Fortran APIs is supported.
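
As an illustration of this self-describing layout, the following minimal C sketch uses the standard netCDF C API to create a netCDF-4 file containing one dimension, one variable, and one global attribute; the file, dimension, and variable names are invented for the example.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <netcdf.h>

/* Abort with the library's error message if a netCDF call fails. */
#define CHECK(e) do { int rc = (e); if (rc != NC_NOERR) { \
    fprintf(stderr, "netCDF error: %s\n", nc_strerror(rc)); exit(1); } } while (0)

int main(void)
{
    int ncid, dimid, varid;
    float temps[4] = {15.0f, 15.5f, 16.1f, 16.4f};
    const char *title = "example dataset";

    /* Create a netCDF-4 (HDF5-based) file; NC_CLOBBER overwrites any existing file. */
    CHECK(nc_create("example.nc", NC_CLOBBER | NC_NETCDF4, &ncid));

    /* Define a dimension, a variable over it, and a global attribute.
       These definitions become the self-describing header of the file. */
    CHECK(nc_def_dim(ncid, "time", 4, &dimid));
    CHECK(nc_def_var(ncid, "temperature", NC_FLOAT, 1, &dimid, &varid));
    CHECK(nc_put_att_text(ncid, NC_GLOBAL, "title", strlen(title), title));

    /* Leave define mode and write the data array. */
    CHECK(nc_enddef(ncid));
    CHECK(nc_put_var_float(ncid, varid, temps));
    CHECK(nc_close(ncid));
    return 0;
}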

The software libraries supplied by UCAR provide read-write access to netCDF files, encoding and decoding the necessary arrays and metadata. The core library is written in C, and provides an application programming interface (API) for C, C++ and two APIs for Fortran applications, one for Fortran 77, and one for Fortran 90. An independent implementation, also developed and maintained by Unidata, is written in 100% Java, which extends the core data model and adds additional functionality. Interfaces to netCDF based on the C library are also available in other languages including R (ncdf, ncvar and RNetCDF packages), Perl Data Language, Python, Ruby, Haskell, Mathematica, MATLAB, Interactive Data Language (IDL), Julia and Octave. The specification of the API calls is very similar across the different languages, apart from inevitable differences of syntax. The API calls for version 2 were rather different from those in version 3, but are also supported by versions 3 and 4 for backward compatibility. Application programmers using supported languages need not normally be concerned with the file structure itself, even though it is available as open formats.
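
A corresponding read-side sketch in C, again with illustrative file and variable names taken from the example above, shows how the same API supports efficient subsetting: a hyperslab read transfers only the requested portion of the array.

#include <stdio.h>
#include <stdlib.h>
#include <netcdf.h>

#define CHECK(e) do { int rc = (e); if (rc != NC_NOERR) { \
    fprintf(stderr, "netCDF error: %s\n", nc_strerror(rc)); exit(1); } } while (0)

int main(void)
{
    int ncid, varid;
    size_t start[1] = {1};   /* skip the first element          */
    size_t count[1] = {2};   /* read two consecutive values     */
    float vals[2];

    /* Open the file read-only and look up the variable by name. */
    CHECK(nc_open("example.nc", NC_NOWRITE, &ncid));
    CHECK(nc_inq_varid(ncid, "temperature", &varid));

    /* Subset read: only the selected slice is read from disk. */
    CHECK(nc_get_vara_float(ncid, varid, start, count, vals));
    printf("temperature[1..2] = %.1f, %.1f\n", vals[0], vals[1]);

    CHECK(nc_close(ncid));
    return 0;
}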

189-492: A proliferation of different data models, including multidimensional arrays, raster images , and tables. Each defines a specific aggregate data type and provides an API for reading, writing, and organizing the data and metadata. New data models can be added by the HDF developers or users. HDF is self-describing, allowing an application to interpret the structure and contents of a file with no outside information. One HDF file can hold

216-758: Is built upon MPI-IO , the I/O extension to MPI communications. Using the high-level netCDF data structures, the Parallel-NetCDF libraries can make use of optimizations to efficiently distribute the file read and write applications between multiple processors. The Parallel-NetCDF package can read/write only classic and 64-bit offset formats. Parallel-NetCDF cannot read or write the HDF5-based format available with netCDF-4.0. The Parallel-NetCDF package uses different, but similar APIs in Fortran and C. Parallel I/O in
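
In practice the CF conventions are expressed as named attributes on each variable. The following C fragment is a sketch of how such metadata might be attached with the netCDF C API while a file is still in define mode; the attribute names standard_name, units and long_name are defined by the CF conventions, while the variable and the attribute values are illustrative.

#include <string.h>
#include <netcdf.h>

/* Attach CF-convention metadata to an already-defined variable.
   Intended to be called between nc_def_var() and nc_enddef(). */
static int add_cf_metadata(int ncid, int varid)
{
    int rc;
    const char *std  = "air_temperature";               /* a CF standard name   */
    const char *unit = "K";                              /* units of the data    */
    const char *desc = "near-surface air temperature";   /* human-readable label */

    if ((rc = nc_put_att_text(ncid, varid, "standard_name", strlen(std),  std))  != NC_NOERR) return rc;
    if ((rc = nc_put_att_text(ncid, varid, "units",         strlen(unit), unit)) != NC_NOERR) return rc;
    return    nc_put_att_text(ncid, varid, "long_name",     strlen(desc), desc);
}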

An extension of netCDF for parallel computing called Parallel-NetCDF (or PnetCDF) has been developed by Argonne National Laboratory and Northwestern University. This is built upon MPI-IO, the I/O extension to MPI communications. Using the high-level netCDF data structures, the Parallel-NetCDF libraries can make use of optimizations to efficiently distribute the file read and write applications between multiple processors. The Parallel-NetCDF package can read/write only classic and 64-bit offset formats; it cannot read or write the HDF5-based format available with netCDF-4.0. The Parallel-NetCDF package uses different, but similar APIs in Fortran and C. Parallel I/O in the Unidata netCDF library has been supported since release 4.0, for HDF5 data files. Since version 4.1.1 the Unidata netCDF C library supports parallel I/O to classic and 64-bit offset files using the Parallel-NetCDF library, but with the netCDF API.
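
A minimal collective-write sketch with the PnetCDF C API (which closely mirrors the serial netCDF API, with ncmpi_ prefixes and MPI communicators) might look like the following; the file and variable names are invented for the example.

#include <mpi.h>
#include <pnetcdf.h>
#include <stdio.h>
#include <stdlib.h>

#define CHECK(e) do { int rc = (e); if (rc != NC_NOERR) { \
    fprintf(stderr, "PnetCDF error: %s\n", ncmpi_strerror(rc)); \
    MPI_Abort(MPI_COMM_WORLD, 1); } } while (0)

int main(int argc, char **argv)
{
    int rank, nprocs, ncid, dimid, varid;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    /* Collectively create a 64-bit offset file; PnetCDF handles only the
       classic and 64-bit offset formats, not the HDF5-based netCDF-4 format. */
    CHECK(ncmpi_create(MPI_COMM_WORLD, "parallel.nc", NC_CLOBBER | NC_64BIT_OFFSET,
                       MPI_INFO_NULL, &ncid));

    /* One value per MPI rank along a shared dimension. */
    CHECK(ncmpi_def_dim(ncid, "rank", (MPI_Offset)nprocs, &dimid));
    CHECK(ncmpi_def_var(ncid, "value", NC_DOUBLE, 1, &dimid, &varid));
    CHECK(ncmpi_enddef(ncid));

    /* Each process writes its own slice in a single collective call,
       letting MPI-IO coordinate access to the shared file. */
    MPI_Offset start[1] = {rank}, count[1] = {1};
    double val = (double)rank;
    CHECK(ncmpi_put_vara_double_all(ncid, varid, start, count, &val));

    CHECK(ncmpi_close(ncid));
    MPI_Finalize();
    return 0;
}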

The netCDF C library, and the libraries based on it (Fortran 77 and Fortran 90, C++, and all third-party libraries) can, starting with version 4.1.1, read some data in other data formats. Data in the HDF5 format can be read, with some restrictions. Data in the HDF4 format can be read by the netCDF C library if created using the HDF4 Scientific Data (SD) API. The NetCDF-Java library additionally reads a number of other file formats and remote access protocols, with more in development. Since each of these is accessed transparently through the NetCDF API, the NetCDF-Java library is said to implement a common data model for scientific datasets. The Java common data model has three layers, which build on top of each other to add successively richer semantics: the data access layer, the coordinate system layer, and the scientific data type layer. The data model of the data access layer is a generalization of the NetCDF-3 data model, and substantially the same as the NetCDF-4 data model. The coordinate system layer implements and extends the concepts in the Climate and Forecast Metadata Conventions. The scientific data type layer allows data to be manipulated in coordinate space, analogous to the Open Geospatial Consortium specifications. The identification of coordinate systems and data typing is ongoing, but users can plug in their own classes at runtime for specialized processing.
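
This format transparency is visible in the C library as well: a program opens a dataset the same way regardless of the underlying binary representation, and can query which format is in use. The sketch below, with an assumed file name, uses nc_inq_format for that query.

#include <stdio.h>
#include <netcdf.h>

int main(int argc, char **argv)
{
    const char *path = (argc > 1) ? argv[1] : "example.nc";
    int ncid, fmt;

    if (nc_open(path, NC_NOWRITE, &ncid) != NC_NOERR) {
        fprintf(stderr, "could not open %s\n", path);
        return 1;
    }

    /* Ask the library which on-disk representation backs this dataset;
       the rest of the API is the same regardless of the answer. */
    nc_inq_format(ncid, &fmt);
    switch (fmt) {
    case NC_FORMAT_CLASSIC:         printf("classic format\n"); break;
    case NC_FORMAT_64BIT:           printf("64-bit offset format\n"); break;
    case NC_FORMAT_NETCDF4:         printf("netCDF-4 (HDF5-based) format\n"); break;
    case NC_FORMAT_NETCDF4_CLASSIC: printf("netCDF-4, classic model\n"); break;
    default:                        printf("other/unknown format\n"); break;
    }
    nc_close(ncid);
    return 0;
}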

Hierarchical Data Format (HDF) is a set of file formats (HDF4, HDF5) designed to store and organize large amounts of data. Originally developed at the U.S. National Center for Supercomputing Applications, it is supported by The HDF Group, a non-profit corporation whose mission is to ensure continued development of HDF5 technologies and the continued accessibility of data stored in HDF. In keeping with this goal, the HDF libraries and associated tools are available under a liberal, BSD-like license for general use. HDF is supported by many commercial and non-commercial software platforms and programming languages. The freely available HDF distribution consists of the library, command-line utilities, test suite source, Java interface, and the Java-based HDF Viewer (HDFView). The current version, HDF5, differs significantly in design and API from the major legacy version HDF4.

The quest for a portable scientific data format, originally dubbed AEHOO (All Encompassing Hierarchical Object Oriented format), began in 1987 with the Graphics Foundations Task Force (GFTF) at the National Center for Supercomputing Applications (NCSA). NSF grants received in 1990 and 1992 were important to the project. Around this time NASA investigated 15 different file formats for use in the Earth Observing System (EOS) project. After a two-year review process, HDF was selected as the standard data and information system.

HDF4 is the older version of the format, although still actively supported by The HDF Group. It supports a proliferation of different data models, including multidimensional arrays, raster images, and tables. Each defines a specific aggregate data type and provides an API for reading, writing, and organizing the data and metadata. New data models can be added by the HDF developers or users. HDF is self-describing, allowing an application to interpret the structure and contents of a file with no outside information. One HDF file can hold a mix of related objects which can be accessed as a group or as individual objects. Users can create their own grouping structures called "vgroups."

The HDF4 format has many limitations. It lacks a clear object model, which makes continued support and improvement difficult. Supporting many different interface styles (images, tables, arrays) leads to a complex API. Support for metadata depends on which interface is in use; SD (Scientific Dataset) objects support arbitrary named attributes, while other types only support predefined metadata. Perhaps most importantly, the use of 32-bit signed integers for addressing limits HDF4 files to a maximum of 2 GB, which is unacceptable in many modern scientific applications.
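
The SD interface mentioned above is the same one through which the netCDF C library can read HDF4 data. The following C sketch, with invented file and dataset names, is a rough illustration of writing a Scientific Dataset through the HDF4 SD API (mfhdf); exact calling details may vary between HDF4 releases.

#include <stdio.h>
#include "mfhdf.h"

int main(void)
{
    int32 dims[1]  = {4};
    int32 start[1] = {0}, edges[1] = {4};
    float32 data[4] = {1.0f, 2.0f, 3.0f, 4.0f};

    /* Open (create) an HDF4 file through the SD interface. */
    int32 sd_id = SDstart("example.hdf", DFACC_CREATE);
    if (sd_id == FAIL) { fprintf(stderr, "SDstart failed\n"); return 1; }

    /* Create a one-dimensional Scientific Dataset and write it. */
    int32 sds_id = SDcreate(sd_id, "temperature", DFNT_FLOAT32, 1, dims);
    SDwritedata(sds_id, start, NULL, edges, (VOIDP)data);

    /* SD objects accept arbitrary named attributes, unlike the other HDF4 interfaces. */
    SDsetattr(sds_id, "units", DFNT_CHAR8, 1, "K");

    SDendaccess(sds_id);
    SDend(sd_id);
    return 0;
}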

The HDF5 format is designed to address some of the limitations of the HDF4 library, and to address current and anticipated requirements of modern systems and applications. In 2002 it won an R&D 100 Award. HDF5 simplifies the file structure to include only two major types of object: datasets, which are multidimensional arrays of a homogeneous type, and groups, which are container structures that can hold datasets and other groups. This results in a truly hierarchical, filesystem-like data format. In fact, resources in an HDF5 file can be accessed using the POSIX-like syntax /path/to/resource. Metadata is stored in the form of user-defined, named attributes attached to groups and datasets. More complex storage APIs representing images and tables can then be built up using datasets, groups and attributes.

In addition to these advances in the file format, HDF5 includes an improved type system, and dataspace objects which represent selections over dataset regions. The API is also object-oriented with respect to datasets, groups, attributes, types, dataspaces and property lists. The latest version of netCDF, version 4, is based on HDF5. Because it uses B-trees to index table objects, HDF5 works well for time series data such as stock price series, network monitoring data, and 3D meteorological data.
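
The group/dataset/attribute model can be seen in a short C sketch against the HDF5 C API; the file, group, dataset, and attribute names below are invented for the example.

#include <stdio.h>
#include <hdf5.h>

int main(void)
{
    hsize_t dims[2] = {3, 4};
    double grid[3][4] = {{0}};

    /* Create a new HDF5 file and a group inside it; the group behaves like a
       directory, so the dataset below is addressable as /observations/grid . */
    hid_t file  = H5Fcreate("example.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t group = H5Gcreate2(file, "/observations", H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

    /* A dataspace describes the shape of the array; the dataset is the
       typed, multidimensional array itself. */
    hid_t space = H5Screate_simple(2, dims, NULL);
    hid_t dset  = H5Dcreate2(group, "grid", H5T_NATIVE_DOUBLE, space,
                             H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
    H5Dwrite(dset, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, H5P_DEFAULT, grid);

    /* Metadata goes into a named attribute attached to the dataset. */
    hid_t aspace = H5Screate(H5S_SCALAR);
    hid_t attr   = H5Acreate2(dset, "fill_value", H5T_NATIVE_DOUBLE, aspace,
                              H5P_DEFAULT, H5P_DEFAULT);
    double fill = -999.0;
    H5Awrite(attr, H5T_NATIVE_DOUBLE, &fill);

    /* Release handles in reverse order of creation. */
    H5Aclose(attr); H5Sclose(aspace);
    H5Dclose(dset); H5Sclose(space);
    H5Gclose(group); H5Fclose(file);

    printf("wrote example.h5\n");
    return 0;
}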
