
Progress in U.S. Government Information Technology

Table of Contents

- About the Editor
- Telemedicine and Telehealth
- Radio Frequency Identification (RFID) Technology
- Artificial Intelligence
- Cybersecurity Progress in the U.S. Government
- The Federal Cybersecurity Reskilling Academy
- U.S. Government Efforts to Manage eWaste

more than a billion dollars in research. In addition, the collaboration has promoted activities within NIH such as the adoption of modern data and software sharing practices so that the products of research are properly disseminated to the research community.

The Neuroscience Information Framework (NIF) is a dynamic inventory of Web-based neuroscience resources: data, materials, and tools accessible via any computer connected to the internet. An initiative of the NIH Blueprint for Neuroscience Research, NIF advances neuroscience by enabling discovery of, and access to, public research data and tools worldwide through an open source, networked environment.

The NIH Human Connectome Project is an ambitious effort to map the neural pathways that underlie human brain function and to share data about the structural and functional connectivity of the human brain. The project will lead to major advances in our understanding of what makes us uniquely human and will set the stage for future studies of abnormal brain circuits in many neurological and psychiatric disorders.

The Worldwide Protein Data Bank (wwPDB), a repository for the archiving and free distribution of high-quality macromolecular structural data on a timely basis, represents the preeminent source of experimentally determined structures for research and teaching in biology, biological chemistry, and medicine. The US component of the project (RCSB PDB) is jointly funded by five Institutes of NIH, DOE/BER, and NSF, as well as participants in the UK and Japan. The single database contains experimental structures and related annotation for 80,000 macromolecular structures. The Web site receives 211,000 unique visitors per month from 140 different countries.

Around 1 terabyte of data is transferred each month from the website.

The Biomedical Informatics Research Network (BIRN), a national initiative to advance biomedical research through data sharing, provides a data-driven, software-based framework for research teams to share significant quantities of data rapidly, securely, and privately across geographic distance and/or incompatible computing systems, serving diverse research communities.

The National Archive of Computerized Data on Aging (NACDA) program advances research on aging by helping researchers to profit from the under-exploited potential of a broad range of datasets. NACDA preserves and makes available the largest library of electronic data on aging in the United States.

The Collaborative Research in Computational Neuroscience (CRCNS) program is a joint NIH-NSF program to support collaborative research projects between computational scientists and neuroscientists that will advance the understanding of nervous system structure and function, mechanisms underlying nervous system disorders, and computational strategies used by the nervous system. In recent years, the German Federal Ministry of Education and Research has also joined the program to support research in Germany.

Core Techniques and Technologies for Advancing Big Data Science & Engineering (BIGDATA) is a new joint solicitation between NSF and NIH that aims to advance the core scientific and technological means of managing, analyzing, visualizing, and extracting useful information from large, diverse, distributed, and heterogeneous data sets.

Specifically, it will support the development and evaluation of technologies and tools for data collection and management, data analytics, and/or e-science collaborations, which will enable breakthrough discoveries and innovation in science, engineering, and medicine, laying the foundations for US competitiveness for many decades to come.

Cyber Infrastructure Framework for 21st Century Science and Engineering (CIF21) coordinates efforts across NSF to create meaningful cyber infrastructure, as well as to develop a level of integration and interoperability of data.

CIF21 Track for IGERT: NSF has shared with its community plans to establish a new CIF21 track as part of its Integrative Graduate Education and Research Traineeship (IGERT) program. This track aims to educate and support a new generation of researchers able to address fundamental Big Data challenges concerning core techniques and technologies.

Data Citation, which provides opportunities for the use and analysis of data sets, was encouraged in a Dear Colleague Letter initiated by NSF's Geosciences Directorate, demonstrating NSF's commitment to responsible stewardship and sustainability of data resulting from federally funded research.

Data and Software Preservation for Open Science (DASPOS) is a first attempt to connect physicists at the LHC and Fermilab/Tevatron with experts in digital curation, heterogeneous high-throughput storage systems, large-scale computing systems, and grid access and infrastructure. The intent is to define and execute a set of well-defined activities on which to base a general approach, finding commonality among various science communities.

The Digging into Data Challenge addresses how big data changes the research landscape for the humanities and social sciences, in which new, computationally-based research methods are needed to search, analyze, and understand massive databases of materials such as transactional data from web searches, sensors, and cell phone records.

Administered by the National Endowment for the Humanities, this Challenge is funded by multiple US and international organizations.

The USGS John Wesley Powell Center for Analysis and Synthesis announced eight new research projects for transforming big data sets and big ideas about Earth science theories into scientific discoveries. At the Center, scientists collaborate to perform state-of-the-art synthesis to leverage comprehensive, long-term data.
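A minimal illustration of the computational text-analysis primitive that efforts like the Digging into Data Challenge build on is counting term frequencies across a document collection. The documents and terms below are invented for the example; real humanities-scale analyses add tokenization, stemming, and distributed computation on top of this core counting idea.

```python
from collections import Counter
import re

def term_frequencies(documents):
    """Count how often each lowercase word appears across a corpus --
    the most basic building block of computational text analysis."""
    counts = Counter()
    for doc in documents:
        counts.update(re.findall(r"[a-z']+", doc.lower()))
    return counts

# Invented miniature corpus standing in for a massive database.
corpus = [
    "Railroads reshaped commerce across the plains.",
    "Commerce along the railroads grew as the decades passed.",
]
freqs = term_frequencies(corpus)
top = freqs.most_common(2)  # most frequent terms first
```

From counts like these, analysts derive the search and comparison operations that scale to massive transactional datasets.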

Quantum Computing

In July 2016, the National Science and Technology Council of the Executive Office of the President, in a report titled "Advancing Quantum Information Science: National Challenges and Opportunities," described Quantum Information Science (QIS) as a foundational science, with envisioned applications that include sensing and metrology, communications, simulation, and high-performance computing. The report also pointed out specifically that quantum communication, the ability to transmit information encoded in quantum states of light or matter, is currently an active area of development. The report further states that, in the future, quantum networks will connect distributed quantum sensors to allow long-distance transmission of quantum information. Quantum information science combines two of the great scientific and technological revolutions of the 20th century: quantum mechanics on the one hand, and computer-based information science on the other.

One of the fundamentally important research areas involved in quantum information science is quantum communications, which deals with the exchange of information encoded in quantum states of matter, or quantum bits (known as qubits), between both nearby and distant quantum systems. Quantum computing is based on quantum bits or qubits. Unlike traditional computers, in which bits must have a value of either zero or one, a qubit can represent both values simultaneously. Representing information in qubits allows the information to be processed in ways that have no equivalent in classical computing, taking advantage of phenomena such as quantum tunneling and quantum entanglement. As such, quantum computers may theoretically be able to solve certain problems in a few days that would take millions of years on a classical computer.
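The superposition behavior described above can be made concrete with a tiny state-vector sketch in plain Python (no quantum hardware implied): a qubit is a pair of amplitudes, a Hadamard gate puts it into an equal superposition of 0 and 1, and applying the gate twice restores the original state through interference, something no classical bit can do.

```python
import math

# A qubit is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measuring yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
def apply_hadamard(state):
    """Apply the Hadamard gate, which maps a basis state into an
    equal superposition of |0> and |1>."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

def measurement_probs(state):
    alpha, beta = state
    return (abs(alpha) ** 2, abs(beta) ** 2)

# Start in |0>, apply Hadamard: the qubit now represents both values
# at once, with a 50/50 chance of reading out 0 or 1.
plus = apply_hadamard((1.0, 0.0))
p0, p1 = measurement_probs(plus)

# Applying Hadamard twice returns the qubit to |0> -- interference,
# which has no classical-bit analogue.
back = apply_hadamard(plus)
```

Real quantum states use complex amplitudes and many entangled qubits, but the two-amplitude picture is the essential departure from classical bits.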

Qubits are the quantum analogue to the classical computer bits "0" and "1". Engineering materials that can function as qubits is technically challenging. Using supercomputers, scientists from the University of Chicago and Argonne National Laboratory predicted possible new qubits built out of strained aluminum nitride. Moreover, the scientists showed that certain newly developed qubits in silicon carbide have unusually long coherence times. Quantum computers could break common cryptography techniques, search huge datasets, and simulate quantum systems in a fraction of the time it would take today's computers. However, engineers first need to harness the properties of quantum bits. Engineering new qubits with less difficult methods could lower one of the significant barriers to scaling quantum computers from small prototypes into larger-scale systems. One of the leading methods for creating qubits involves exploiting specific structural atomic defects in diamonds.

Using diamonds is both technically challenging and expensive. Now researchers from the University of Chicago and Argonne National Laboratory have suggested an analogous defect in aluminum nitride, which could reduce the difficulty and ultimate cost of manufacturing materials for quantum computing applications. Using the Edison and Mira supercomputers at DOE's National Energy Research Scientific Computing Center and Argonne National Laboratory, respectively, the researchers found that by applying strain to aluminum nitride they can create structural defects in the material that may be harnessed as qubits similar to those seen in diamonds. They performed their calculations using different levels of theory and the Quantum Espresso and WEST codes, the latter developed at the University of Chicago. The codes allowed them to accurately predict the position of the defect levels in the band gap of semiconductors. The researchers also closely collaborated with experimentalists to understand and improve the performance of qubits in industrial materials. Recently, they showed that newly developed qubits in silicon carbide have much longer coherence times than those of the more well-established defect qubits in diamond. Their results pointed to industrially important materials as promising hosts for qubits.

The 1994 breakthrough discovery of a polynomial-time quantum algorithm for integer factorization (Shor's algorithm) sparked great interest in discovering additional quantum algorithms and developing hardware on which to run them. The subsequent research efforts yielded quantum algorithms offering speedups for widely varying problems, and several promising hardware platforms for quantum computation. These platforms include analog systems (usually cold atoms) used for simulating quantum lattice models from condensed-matter and high-energy physics, quantum annealers for combinatorial optimization, mode samplers, and small-scale noisy prototypes of digital gate-based quantum computers. In the longer term, the emergence of scalable, fault-tolerant digital quantum computers offers a new direction for progress in high performance computing as conventional technologies reach their fundamental limitations.
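The classical skeleton of that 1994 factoring algorithm can be sketched as follows. The quantum computer's role is only the order-finding step; here it is brute-forced classically for a toy modulus, so this is an illustration of the reduction, not a quantum speedup.

```python
import math

def find_order(a, n):
    """Classically brute-force the multiplicative order r of a mod n --
    the one step a quantum computer performs in polynomial time."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_from_order(a, n):
    """Recover nontrivial factors of n from the order of a: the
    classical post-processing step of Shor's algorithm."""
    r = find_order(a, n)
    if r % 2 != 0:
        return None  # odd order: retry with a different a
    half = pow(a, r // 2, n)
    if half == n - 1:
        return None  # trivial square root: retry with a different a
    return math.gcd(half - 1, n), math.gcd(half + 1, n)

# Toy example: the order of 7 mod 15 is 4, which yields factors 3 and 5.
factors = factor_from_order(7, 15)
```

For cryptographically sized moduli, `find_order` is intractable classically, which is exactly where the quantum period-finding circuit provides the exponential advantage.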

Quantum speedups have been discovered for a number of areas of DOE interest, including nuclear and particle physics and materials science, as well as data analysis and machine learning. In addition, quantum speedups have been discovered for basic primitives of applied mathematics such as linear algebra, integration, optimization, and graph theory. These demonstrate the potential of quantum computers to provide better-scaling methods (in some cases exponentially better) for performing a wide variety of scientific computing tasks. Practical realization of this potential will depend not only on advances in quantum computing hardware but also on advances in optimizing languages and compilers to translate these abstract algorithms into concrete sequences of realizable quantum gates, and in simulators to test and verify these sequences. The development of such software has recently seen rapid progress, which can be expected to continue given sufficient support.
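As a toy example of the verification such simulators perform, the sketch below simulates a two-qubit state vector directly and checks that the textbook gate sequence, Hadamard followed by CNOT, produces an entangled Bell state. This is pure Python using standard textbook conventions, not any specific DOE tool.

```python
import math

def apply_h(state, qubit):
    """Apply a Hadamard gate to one qubit of a 2-qubit state vector
    (4 amplitudes, basis index bit `qubit` selects the qubit)."""
    s = 1 / math.sqrt(2)
    new = state[:]
    for i in range(4):
        if not (i >> qubit) & 1:  # handle each amplitude pair once
            j = i | (1 << qubit)
            a, b = state[i], state[j]
            new[i], new[j] = s * (a + b), s * (a - b)
    return new

def apply_cnot(state, control, target):
    """Swap amplitude pairs that differ in the target bit whenever
    the control bit is 1."""
    new = state[:]
    for i in range(4):
        if (i >> control) & 1:
            new[i] = state[i ^ (1 << target)]
    return new

# |00> -> H on qubit 0 -> CNOT -> Bell state (|00> + |11>) / sqrt(2)
state = [1.0, 0.0, 0.0, 0.0]
state = apply_cnot(apply_h(state, 0), control=0, target=1)
```

A simulator like this lets a compiled gate sequence be checked exhaustively at small scale before it is trusted on hardware, which is the role the software stack described above plays at much larger scale.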

About the Editor

Michael Erbschloe has worked for over 30 years performing analysis of the economics of information technology, public policy relating to technology, and utilizing technology in reengineering organization processes. He has authored several books on social and management issues of information technology, published by McGraw-Hill and others, and has also taught college courses and developed technology-related curriculum. His career has focused on several interrelated areas:

- Technology strategy, analysis, and forecasting
- Teaching and curriculum development
- Writing books and articles
- Publishing and editing
- Public policy analysis and program evaluation

Books by Michael Erbschloe:

- Extremist Propaganda in Social Media: A Threat to Homeland Security (CRC Press)
- Threat Level Red: Cybersecurity Research Programs of the U.S. Government (CRC Press)
- Walling Out the Insiders: Controlling Access to Improve Organizational Security (CRC Press)
- Physical Security for IT (Elsevier Science)
- Trojans, Worms, and Spyware (Butterworth-Heinemann)
- Implementing Homeland Security in Enterprise IT (Digital Press)
- Information Warfare: How to Survive Cyber Attacks (McGraw-Hill)
- The Executive's Guide to Privacy Management (McGraw-Hill)
- Net Privacy: A Guide to Developing & Implementing an e-business Privacy Plan (McGraw-Hill)

Big Data Initiatives

Many companies are sponsoring Big Data-related competitions and providing funding for university research. Universities are creating new courses, and entire courses of study, to prepare the next generation of data scientists. Organizations like Data Without Borders have helped by providing pro bono data collection, analysis, and visualization. There have also been US Federal government programs that address the challenges of, and tap the opportunities afforded by, the big data revolution to advance agency missions and further scientific discovery and innovation.

This paper presents a small sampling of the activities of the government.

The Department of Defense (DOD) was investing $250 million to harness and utilize massive data in new ways and to bring together sensing, perception, and decision support to make truly autonomous systems that can maneuver and make decisions on their own. This has included improving situational awareness to help warfighters and analysts and provide increased support to operations. DOD has been seeking a 100-fold increase in the ability of analysts to extract information from texts in any language, and a similar increase in the number of objects and events that analysts can observe.

The Defense Advanced Research Projects Agency has pursued the Anomaly Detection at Multiple Scales (ADAMS) program, which addresses the problem of anomaly detection and characterization in massive data sets. In this context, anomalies in data are intended to cue collection of additional, actionable information in a wide variety of real-world contexts. The initial ADAMS application domain is insider-threat detection, in which anomalous actions by an individual are detected against a background of routine network activity. The Cyber-Insider Threat (CINDER) program seeks to develop novel approaches to detecting malicious insider activity in computer networks, while the Insight program addresses key shortfalls in current intelligence, surveillance, and reconnaissance systems and aims to develop a resource management system to automatically identify threat networks and irregular warfare operations through analysis of multiple sources of information.

The DARPA Mission-oriented Resilient Clouds program aims to address security challenges inherent in cloud computing by developing technologies to detect, diagnose, and respond to attacks, effectively building a "community health system" for the cloud. The program also aims to develop technologies to enable cloud applications and infrastructure to continue functioning while under attack. The loss of individual hosts and tasks within the cloud ensemble would be allowable as long as overall mission effectiveness is preserved.

The DARPA Video and Image Retrieval and Analysis (VIRAT) program aims to develop a system to provide military analysts with the capability to exploit the vast amount of overhead video content being collected.
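The anomaly-cueing idea behind programs like ADAMS can be illustrated in drastically simplified form: score each new observation against a sliding window of recent behavior and flag large deviations for follow-up. The data, window size, and threshold below are invented for the example; real systems are far more sophisticated.

```python
from collections import deque

def streaming_anomalies(values, window=20, threshold=3.0):
    """Flag points more than `threshold` standard deviations from
    the mean of a sliding window of recent values -- a toy stand-in
    for the anomaly cues ADAMS-style systems raise on massive data."""
    recent = deque(maxlen=window)
    flagged = []
    for i, v in enumerate(values):
        if len(recent) == recent.maxlen:
            mean = sum(recent) / len(recent)
            var = sum((x - mean) ** 2 for x in recent) / len(recent)
            std = var ** 0.5
            if std > 0 and abs(v - mean) / std > threshold:
                flagged.append(i)
        recent.append(v)
    return flagged

# 200 routine readings fluctuating around 10, with one spike injected.
data = [10 + ((i * 37) % 7 - 3) * 0.1 for i in range(200)]
data[150] = 25.0
anomalies = streaming_anomalies(data)
```

The point of cueing, as in ADAMS, is that the flagged indices trigger collection of additional, actionable information rather than serving as verdicts themselves.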

If successful, VIRAT will enable analysts to establish alerts for activities and events of interest as they occur. VIRAT also seeks to develop tools that would enable analysts to rapidly retrieve, with high precision and recall, video content from extremely large video libraries.

The DHS Center of Excellence on Visualization and Data Analytics (CVADA) has supported research efforts on large heterogeneous data that first responders could use to address issues ranging from manmade or natural disasters to terrorist incidents.

The DOE Office of Advanced Scientific Computing Research (ASCR) provides leadership in data management, visualization, and data analysis, as well as data preservation and community access. Programs within the suite include widely used data management technologies such as the Kepler scientific workflow system; the Storage Resource Management standard; a variety of data storage management technologies such as BeSTMan, the Bulk Data Mover, and the Adaptable IO System (ADIOS); FastBit data indexing technology (used by Yahoo!); and two major scientific visualization tools, ParaView and VisIt.

Mathematics for Analysis of Petascale Data addresses the mathematical challenges of extracting insights from huge scientific datasets, finding key features, and understanding the relationships between those features.
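The core idea behind bitmap indexing of the FastBit variety can be sketched in a few lines: build one bitmask per distinct column value, so that multi-predicate queries reduce to fast bitwise operations. The toy event table below is invented for the example; FastBit itself adds compression and range encoding on top of this idea.

```python
def build_bitmap_index(column):
    """Build one integer bitmask per distinct value; bit i is set
    when row i holds that value (the core of a bitmap index)."""
    index = {}
    for row, value in enumerate(column):
        index[value] = index.get(value, 0) | (1 << row)
    return index

def query_and(index_a, value_a, index_b, value_b):
    """Rows matching both predicates: a single bitwise AND."""
    hits = index_a.get(value_a, 0) & index_b.get(value_b, 0)
    return [row for row in range(hits.bit_length()) if hits >> row & 1]

# Toy event table: detector id and status per row.
detector = ["d1", "d2", "d1", "d3", "d1"]
status = ["ok", "err", "err", "ok", "err"]
det_idx = build_bitmap_index(detector)
st_idx = build_bitmap_index(status)
rows = query_and(det_idx, "d1", st_idx, "err")  # rows 2 and 4
```

Because the AND over bitmasks touches each machine word once, queries over billions of rows stay fast, which is what makes this approach attractive for scientific event data.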

Research areas include machine learning, real-time analysis of streaming data, stochastic nonlinear data-reduction techniques, and scalable statistical analysis techniques applicable to a broad range of DOE applications, including sensor data from the electric grid, cosmology, and climate data.

The Office of Basic Energy Sciences (BES) Scientific User Facilities have supported a number of efforts aimed at assisting users with data management and analysis of big data, which can be as big as terabytes (10^12 bytes) of data per day from a single experiment. For example, the Accelerating Data Acquisition, Reduction and Analysis (ADARA) project addresses the data workflow needs of the Spallation Neutron Source (SNS) data system to provide real-time analysis for experimental control, and the Coherent X-ray Imaging Data Bank has been created to maximize data availability and more efficient use of synchrotron light sources.

The Office of Fusion Energy Sciences (FES) has supported the Scientific Discovery through Advanced Computing (SciDAC) partnership between FES and the Office of Advanced Scientific Computing Research (ASCR), which addresses big data challenges associated with computational and experimental research in fusion energy science. The technologies developed by the ASCR-FES partnerships include high-performance input/output systems, advanced scientific workflow and provenance frameworks, and visualization techniques addressing fusion needs, which have attracted the attention of European integrated modeling efforts and ITER.

The Office of Scientific and Technical Information (OSTI), the only US Federal agency member of DataCite (a global consortium), supports technical implementations of the practice of data citation, which enables efficient reuse and verification of data so that the impact of data may be tracked and a scholarly structure that recognizes and rewards data producers may be established.

The Consortium for Healthcare Informatics Research (CHIR) has worked to develop Natural Language Processing (NLP) tools in order to unlock the vast amounts of information that are currently stored in the VA as text data. Meanwhile, AViVA is the VA's next-generation human resources system that will separate the database from the business applications and from the browser-based user interface. Analytical tools are already being built upon this foundation for research and, ultimately, for support of decisions at the patient encounter.

The Centers for Disease Control and Prevention (CDC) has pursued BioSense 2.0, which was the first system to take into account the distributed nature of public health data through a federated network of systems, building on existing state and local capabilities. BioSense 2.0 removes many of the costs associated with traditional system architecture while still making the distributed aspects of the system transparent to end users, as well as making data accessible for appropriate analyses and reporting.

The Centers for Medicare & Medicaid Services (CMS) has a data warehouse based on Hadoop being developed to support analytic and reporting requirements from Medicare and Medicaid programs. A major goal is to develop a supportable, sustainable, and scalable design that accommodates accumulated data at the warehouse level. Also challenging is designing a solution that complements existing technologies.

The FDA Virtual Laboratory Environment (VLE) was designed to combine the tools and capabilities needed to enable a virtual laboratory data network: advanced analytical and statistical tools and analytics to predict and evaluate public health threats, document management support, telepresence capability to enable worldwide collaboration, and the ability to virtually make any location a virtual laboratory with advanced capabilities in a matter of hours.

The National Archives and Records Administration (NARA) has worked to develop a Cyberinfrastructure for a Billion Electronic Records (CI-BER) through a jointly sponsored testbed notable for its application of a multi-agency sponsored cyberinfrastructure and the National Archives' diverse collection of 87+ million files of digital records and information, now active at the Renaissance Computing Institute. This testbed will evaluate technologies and approaches to support sustainable access to ultra-large data collections.

NASA's Advanced Information Systems Technology (AIST) awards seek to reduce the risk and cost of evolving NASA information systems to support future Earth observation missions and to transform observations into Earth information as envisioned by NASA's Climate Centric Architecture. Some AIST programs seek to mature Big Data capabilities to reduce the risk, cost, size, and development time of Earth Science Division space-based and ground-based information systems and to increase the accessibility and utility of science data.

NASA's Earth Science Data and Information System (ESDIS) project, active for over 15 years, has worked to process, archive, and distribute Earth science satellite data and data from airborne and field campaigns.

ESDIS strives to ensure that scientists and the public have access to these data to enable the study of Earth from space, advancing Earth system science to meet the challenges of climate and environmental change.

The Global Earth Observation System of Systems (GEOSS) is a collaborative, international effort to share and integrate Earth observation data. NASA has joined forces with the US Environmental Protection Agency (EPA), the National Oceanic and Atmospheric Administration (NOAA), and other agencies and nations to integrate satellite and ground-based monitoring and modeling systems to evaluate environmental conditions and predict outcomes of events such as forest fires, population growth, and other developments that are natural and man-made. In the near term, researchers working with academia will integrate data to better understand and address the impact of air quality on the environment and human health.

The Cancer Imaging Archive (TCIA) is an image data-sharing service that facilitates open science in the field of medical imaging. TCIA aims to improve the use of imaging in cancer research by increasing the efficiency and reproducibility of imaging-based detection and diagnosis, using imaging to provide an objective assessment of therapeutic response, and ultimately enabling the development of imaging resources that will lead to improved clinical outcomes.

The National Institute of Biomedical Imaging and Bioengineering (NIBIB) has supported the development and launch of an interoperable and curated Nanomaterial Registry. The institute seeks to establish a nanomaterial registry whose primary function is to provide curated information on the biological and environmental interactions of well-characterized nanomaterials, as well as links to associated manufacturing guidance. The registry facilitates building standards for characterizing nanomaterials.

The NIH Biomedical Information Science and Technology Initiative (BISTI) Consortium has for over a decade joined the institutes and centers at NIH to promote the nation's research in Bioinformatics and Computational Biology (BICB). The Consortium has promoted a number of program announcements and funded