CRA frequently talks about the need for more basic scientific research but we focus almost exclusively on governmental research investment. We talk about the fall of DARPA and the need for NSF to increase to compensate. We don't spend quite as much time talking about industry investment in basic research. An article in Business Week points out the necessity of industry participation in the research ecosystem and the rich history of corporate laboratories' basic research contributions. It's a very interesting article that weaves together the past and present research ecosystems, today's economic concerns, and suggestions for tackling the problems we see today.
The article discusses the two times in US history when the government spurred scientific innovation in a short period of time - the Manhattan Project and the Apollo space mission - and the reasons they were so successful. It states, "Their success can be mapped to five crucial success factors: 1) full and sustained Presidential support; 2) effective leadership with a clearly defined mandate; 3) access to resources; 4) parallel paths/processing to save time; and 5) private sector outsourcing."
It also discusses the best basic research model, which it says combines universities' research efforts with "a dynamic public-private network of labs and a venture capital industry waiting downstream to commercialize ideas and turn them into large public companies that create hundreds of thousands of new jobs." The article then lists what's needed to get that model back on track.
The article is a good read with good historical background and ideas for the present.
The President used a speech before the members of the National Academy of Sciences today to reiterate his commitment to boosting the U.S. investment in science and technology. In his remarks before the opening session of the National Academy's annual meeting, Obama set a goal of seeing the U.S. invest 3 percent or more of its annual GDP in basic and applied scientific research funding. This would be the largest such investment in American history -- an even larger share of GDP than the U.S. invested during the space race of the 1950s and 60s. Here's a choice quote from AP coverage of the speech:
The pursuit of discovery a half century ago fueled the nation's prosperity and success, Obama told the academy. "The commitment I am making today will fuel our success for another 50 years," he said. "This work begins with an historic commitment to basic science and applied research."

He set forth a wish list for the future including "learning software as effective as a personal tutor; prosthetics so advanced that you could play the piano again; an expansion of the frontiers of human knowledge about ourselves and the world around us."

"We can do this," Obama said to applause.

According to a White House fact sheet distributed after his remarks today, the President plans to back up his rhetoric with a number of budgetary commitments, including:

• A commitment to finish the 10-year doubling of three key science agencies (the National Science Foundation, the Department of Energy's Office of Science, and the National Institute of Standards and Technology). Between 2009 and 2016, the Administration's enacted and proposed budgets would add $42.6 billion to the 2008 budgets for these basic research agencies, with a special emphasis on encouraging high-risk, high-return research and supporting researchers at the beginning of their careers.

• The launch of the Advanced Research Projects Agency-Energy (ARPA-E), a new Department of Energy organization modeled after the Defense Advanced Research Projects Agency, the defense agency that gave us the Internet, stealth aircraft, and many other technological breakthroughs.

• A joint initiative by the Department of Energy and NSF to inspire tens of thousands of American students to pursue careers in science, engineering, and entrepreneurship related to clean energy, with programs and scholarships from grade school to graduate school.

The President also used the occasion to name the members of his President's Council of Advisors on Science and Technology (PCAST) -- a committee of representatives from science and industry who will examine aspects of federal science policy and make recommendations to the President. For the last several years, PCAST has also assumed the statutory responsibilities of the President's Information Technology Advisory Committee (PITAC), which was dissolved as a free-standing committee under President Bush (though there may be a move to reestablish the free-standing committee -- more on that in a future post).
Among the new PCAST members are at least four from the computing community:
The President's commitment to continuing the very recent robust increases for federal R&D -- after several years of real-dollar declines -- along with recent statements by key Senate appropriations staff who believe 7 percent annual increases for NSF are "sustainable," give us reason to be somewhat optimistic going into the appropriations season this year. However, as always, other pressing concerns and shortfalls in the federal budget can adversely affect science funding despite all the apparent support, so we'll be keeping a close eye on the process. But Obama's initial steps here may turn out to be giant ones for U.S. science and innovation.
CRA member Google Inc.'s CEO Eric Schmidt gave a speech yesterday in DC regarding government and technology. Schmidt is a member of President-elect Obama's transition team but he focused more on issues that the technology community (including CRA) has been talking about for years, including research funding. The Washington Times has all the details but here's a brief quote on research:
Mr. Schmidt said the government has an important role to play in funding research, noting that businesses "by law have to serve their shareholders" and therefore are not going to "fundamentally invest at the level of pure research."

"It takes government policy. That model works," he said, citing a pledge by Mr. Obama to double basic spending on scientific research, which declined this year.
Check out the article for more on what Schmidt talked about or listen to the speech at the New America Foundation (mp3 format download).
Today, as part of CRA's mission to improve public and policymaker understanding of the importance of computing and computing research, we're pleased to announce the launch of a new feature on the CRA and CCC web pages: the Computing Research Highlight of the Week. Each week, we'll highlight some of the exciting and important research results recently generated by the computing community.
Our first highlight features a new algorithm developed by researchers at the Jacobs School of Engineering at UC San Diego that promises to significantly boost the efficiency of network routing.
We hope to accomplish a few things with these highlights. First, we want to show off the good work being done in our community in a way that is accessible to the general public. One model for this is the very popular Astronomy Picture of the Day, where each day a new photo or graphic (or video) having something to do with astronomy is featured along with a succinct description. We hope to do the same for computing. Second, we hope to build up a good database of examples of the vibrancy of the computing fields that we can use in our advocacy efforts with Congress, the Administration, and federal agencies. Having a collection of easily accessible and digestible research "nuggets" helps us immeasurably when trying to make the case for computing research to policymakers. Third, we want to make sure members of our own community are aware of the wide variety of interesting research results being generated across the various sub-disciplines of computing, and perhaps even make connections to their own work.
We've tried to make it easy for you to keep track of the current weekly highlight with an RSS feed, an email notification system, and even embed code that allows you to feature the highlight of the week on your own web page. Each week's highlight also features prominently on both the CRA and CCC home pages.
So how do you get your own work featured as a Computing Research Highlight of the Week? It's easy: just submit it! From those submissions, CRA and CCC staff and volunteers choose a new highlight each week. We're pleased that so many answered our call last July for your research highlights, but we want more. So submit your interesting and important research results today!
Computerworld has published a great couple of articles this week regarding the next Administration, technology, and US innovation. They feature a number of folks well-known in the CS community and are definitely worth checking out.
Dear Mr. President: Let’s Talk Tech
From the explanatory statement for the Continuing Resolution that will fund government agencies until March 6, 2009:
Defense Advanced Research Projects Agency (DARPA)

The fiscal year 2009 budget request for DARPA is $3,285,569,000, an increase of $326,493,000, more than 10 percent, over the fiscal year 2008 appropriated program of $2,959,076,000. In recent years, DARPA has repeatedly underexecuted its funded program level, executing a fiscal year 2005 program that was nine percent below the appropriated program and a fiscal year 2006 program that was twelve percent below the appropriated program. Based on program execution to date, DARPA will likely continue that trend for the fiscal year 2007 and 2008 programs. While DARPA's continued underexecution can partially be explained by its fiscally responsible management approach of withholding funds from projects that fail to demonstrate progress, doubts exist about DARPA's ability to responsibly manage such a large increase. Therefore, the bill provides $3,142,229,000, a reduction of $143,340,000 from the request. The Director of DARPA is directed to provide to the congressional defense committees not later than 60 days after enactment of this Act a report that details by program element and project the application of undistributed reductions made in this Act....

Wired's Noah Shachtman, writing for the Danger Room blog, has more.
The Task Force on the Future of American Innovation and the Science Coalition held a press conference this morning on “Fueling America’s Future”--the importance of federal funding for basic energy research. While both groups support a broad basic research agenda, this event emphasized the need for basic research in energy to solve America’s energy crisis. The event, held at the National Press Club, took place before a standing room only crowd. The four speakers were:
The speakers all called for an increase in funding for basic energy research and for the next President to take bold action to keep the US competitive in new technologies and discoveries in alternative energy sources. Each of the distinguished speakers brought their own take to the issue, but all spoke to the common goal of energy independence and reducing fossil fuel consumption while helping the environment.
Also featured at the event was a petition signed by over 70 organizations (including CRA) to the two Presidential candidates to focus on basic energy research in the White House to ensure America’s long-term security.
A recording of the event will be available on either the Task Force or Science Coalition website soon. We'll have the link here when it appears.
Update: Watch the full press event here.
Now that Senator John McCain has supplied his answers to the Science Debate 2008 questions, we can take a look at the similarities and differences between the two candidates on a topic that could determine the United States' competitive and economic future in the next administration. We highlighted some of Senator Obama's answers here earlier, and all of the answers from both candidates can be found here. Previously in this space we have contrasted the technological agendas from each campaign's web site.
McCain specifically calls out information technology research and computer science as important in several of his answers. He says he wants to invest in basic and applied research, particularly in new and emerging areas and in information technology, and that he will "support significant increases in basic research" at the various federal agencies -- though he stops short of saying he would fully fund the America COMPETES Act, in sharp contrast to Obama, who has promised the doubling called for in that legislation. McCain also supports greater education efforts in science and math to fill the skilled jobs needed in an innovation economy. In particular, he supports giving $250 million to states to increase participation in AP courses in math, science, and computer science by offering them virtually, as well as supporting the STEM education programs at federal science agencies like DOE and NSF -- a markedly different stance than the current administration's.
Here are excerpts from McCain's answers to the questions that are most relevant to the computing community:
Q1. Innovation. Science and technology have been responsible for half of the growth of the American economy since WWII. But several recent reports question America's continued leadership in these vital areas. What policies will you support to ensure that America remains the world leader in innovation?

"...America has led the world into this technology revolution because we have enabled innovation to take root, grow, and prosper. Nurturing technology and innovation is essential for solving the critical problems facing our country..."
"As President, I will ---
• Focus on addressing national needs to make the United States a leader in developing, deploying, and exporting new technologies;
• Utilize the nation's science and technology infrastructure to develop a framework for economic growth both domestically and globally;
• Appoint a Science and Technology Advisor within the White House to ensure that the role of science and technology in policies is fully recognized and leveraged, that policies will be based upon sound science, and that the scientific integrity of federal research is restored;
• Eliminate wasteful earmarks in order to allocate funds for science and technology investments;
• Fund basic and applied research in new and emerging fields such as nanotechnology and biotechnology, and in greater breakthroughs in information technology;
...
• Encourage and facilitate commercialization of new innovations, especially those created from federally funded research;
• Grow public understanding and popularity of mathematics and science by reforming mathematics and science education in schools;
• Develop and implement a global competitive agenda through a series of business roundtables with industry and academia leaders."
Q4. Education. A comparison of 15-year-olds in 30 wealthy nations found that average science scores among U.S. students ranked 17th, while average U.S. math scores ranked 24th. What role do you think the federal government should play in preparing K-12 students for the science and technology driven 21st Century?
"My Administration will promote economic policies that will spur economic growth and a focus on an innovative economy. Critical to these efforts is the creation of the best trained, best prepared workforce to drive this economy through the 21st century. America's ability to compete in the global market is dependent on the availability of a skilled workforce. Less than 20 percent of our undergraduate students obtaining degrees in math or science, and the number of computer science majors have fallen by half over the last eight years. America must address these trends in education and training if it hopes to compete successfully.
But I believe that education is an ongoing process. Thus our nation's education system should not only focus on graduating new students; we must also help re-train displaced workers as they prepare for the rapidly evolving economy. Invigorating our community college system is a good place to start. For example, recognizing this, I have long supported grants for educational instruction in digital and wireless technologies, targeted to minorities and low-income students who may not otherwise be exposed to these fields.
Beyond the basics of enabling every student to reach their potential, our country is faced with a critical shortage of students with specific skills fundamental to our ability to compete globally.
The diminishing number of science, technology, engineering and math graduates at the college level poses a fundamental and immediate threat to American competitiveness.
We must fill the pipeline to our colleges and universities with students prepared for the rigors of advanced engineering, math, science and technology degrees.
We must move aggressively to provide opportunities from elementary school on, for students to explore the sciences through laboratory experimentation, science fairs and competitions.
We must bring private corporations more directly into the process, leveraging their creativity, and experience to identify and maximize the potential of students who are interested and have the unique potential to excel in math and science.
We must strengthen skills of existing science and math teachers through training and education, through professional development programs and community colleges. I believe we must provide funding for needed professional teacher development. Where federal funds are involved, teacher development money should be used to enhance the ability of teachers to perform in today's technology driven environment. We need to provide teachers with high quality professional development opportunities with a primary focus on instructional strategies that address the academic needs of their students. The first 35 percent of Title II funding would be directed to the school level so principals and teachers could focus these resources on the specific needs of their schools.
I will devote 60 percent of Title II funding for incentive bonuses for high performing teachers to locate in the most challenging educational settings, for teachers to teach subjects like math and science, and for teachers who demonstrate student improvement. Payments will be made directly to teachers. Funds should also be devoted to provide performance bonuses to teachers who raise student achievement and enhance the school-wide learning environment. Principals may also consider other issues in addition to test scores such as peer evaluations, student subgroup improvements, or being removed from the state's "in need of improvement" list.
I will allocate $250 million through a competitive grant program to support states that commit to expanding online education opportunities. States can use these funds to build virtual math and science academies to help expand the availability of AP Math, Science, and Computer Sciences courses, online tutoring support for students in traditional schools, and foreign language courses.
I will also continue to support STEM education programs at NSF, DOE, NASA, and NOAA. These scientific agencies can and should play a key role in the education of its future engineers and scientists. These agencies have the opportunity to add a practical component to the theoretical aspects of the students' educational process."
Q13. Research. For many years, Congress has recognized the importance of science and engineering research to realizing our national goals. Given that the next Congress will likely face spending constraints, what priority would you give to investment in basic research in upcoming budgets?
"With spending constraints, it will be more important than ever to ensure we are maximizing our investments in basic research and minimizing the bureaucratic requirements that eat away at the money designed for funding scientists and science. Basic research serves as the foundation for many new discoveries and represents a critical investment for the future of the country and the innovations that drive our economy and protect our people. I have supported significant increases in basic research at the National Science Foundation. I also called for a plan developed by our top scientists on how the funding should be utilized. We must ensure that our research is addressing our national needs and taking advantage of new areas of opportunities and that the results of this research can enter the marketplace. We must also ensure that basic research money is allocated to the best science based on quality and peer review, not politics and earmarks.
I am committed to reinvigorating America's commitment to basic research, and will ensure my administration funds research activities accordingly. I have supported increased funding at DOE, NSF, and NIH for years and will continue to do so. I will continue my commitment to ensure that the funding is properly managed and that the nation's research needs are adequately addressed."
Senator Barack Obama responded to fourteen science questions asked by Science Debate 2008 regarding how an Obama White House would lead the US in areas vital to our competitiveness and innovation. All fourteen questions and Obama's answers in their entirety can be found here. Some highlights of most importance to the computing community include:
Q 1. Innovation. Science and technology have been responsible for half of the growth of the American economy since WWII. But several recent reports question America's continued leadership in these vital areas. What policies will you support to ensure that America remains the world leader in innovation?

Ensuring that the U.S. continues to lead the world in science and technology will be a central priority for my administration. Our talent for innovation is still the envy of the world, but we face unprecedented challenges that demand new approaches. For example, the U.S. annually imports $53 billion more in advanced technology products than we export. China is now the world's number one high technology exporter. This competitive situation may only worsen over time because the number of U.S. students pursuing technical careers is declining. The U.S. ranks 17th among developed nations in the proportion of college students receiving degrees in science or engineering; we were in third place thirty years ago.
My administration will increase funding for basic research in physical and life sciences, mathematics, and engineering at a rate that would double basic research budgets over the next decade. We will increase research grants for early-career researchers to keep young scientists entering these fields. We will increase support for high-risk, high-payoff research portfolios at our science agencies. And we will invest in the breakthrough research we need to meet our energy challenges and to transform our defense programs.
A vigorous research and development program depends on encouraging talented people to enter science, technology, engineering, and mathematics (STEM) and giving them the support they need to reach their potential. My administration will work to guarantee to students access to strong science curriculum at all grade levels so they graduate knowing how science works - using hands-on, IT-enhanced education. As president, I will launch a Service Scholarship program that pays undergraduate or graduate teaching education costs for those who commit to teaching in a high-need school, and I will prioritize math and science teachers. Additionally, my proposal to create Teacher Residency Academies will also add 30,000 new teachers to high-need schools - training thousands of science and math teachers. I will also expand access to higher education, work to draw more of these students into science and engineering, and increase National Science Foundation (NSF) graduate fellowships. My proposals for providing broadband Internet connections for all Americans across the country will help ensure that more students are able to bolster their STEM achievement.

Progress in science and technology must be backed with programs ensuring that U.S. businesses have strong incentives to convert advances quickly into new business opportunities and jobs. To do this, my administration will make the R&D tax credit permanent.
Q 13. Research. For many years, Congress has recognized the importance of science and engineering research to realizing our national goals. Given that the next Congress will likely face spending constraints, what priority would you give to investment in basic research in upcoming budgets?
Federally supported basic research, aimed at understanding many features of nature -- from the size of the universe to subatomic particles, from the chemical reactions that support a living cell to interactions that sustain ecosystems -- has been an essential feature of American life for over fifty years. While the outcomes of specific projects are never predictable, basic research has been a reliable source of new knowledge that has fueled important developments in fields ranging from telecommunications to medicine, yielding remarkable rates of economic return and ensuring American leadership in industry, military power, and higher education. I believe that continued investment in fundamental research is essential for ensuring healthier lives, better sources of energy, superior military capacity, and high-wage jobs for our nation's future.

Yet, today, we are clearly under-investing in research across the spectrum of scientific and engineering disciplines. Federal support for the physical sciences and engineering has been declining as a fraction of GDP for decades, and, after a period of growth of the life sciences, the NIH budget has been steadily losing buying power for the past six years. As a result, our science agencies are often able to support no more than one in ten proposals that they receive, arresting the careers of our young scientists and blocking our ability to pursue many remarkable recent advances. Furthermore, in this environment, scientists are less likely to pursue the risky research that may lead to the most important breakthroughs. Finally, we are reducing support for science at a time when many other nations are increasing it, a situation that already threatens our leadership in many critical areas of science.
This situation is unacceptable. As president, I will increase funding for basic research in physical and life sciences, mathematics, and engineering at a rate that would double basic research budgets over the next decade.
Sustained and predictable increases in research funding will allow the United States to accomplish a great deal. First, we can expand the frontiers of human knowledge. Second, we can provide greater support for high-risk, high-return research and for young scientists at the beginning of their careers. Third, we can harness science and technology to address the "grand challenges" of the 21st century: energy, health, food and water, national security, information technology, and manufacturing capacity.
The other twelve questions and answers are worth taking a look at as well.
Thomas G. Dolan, editorial page editor for Barron's, asserts in an editorial yesterday that federal support for basic research is overrated -- what's really needed to drive innovation in this country are R&D tax cuts for American business and "permanently opening the golden door for foreign scientists and engineers." And while he's not wrong that both tax cuts and improved visa policies are probably key pieces of keeping America's innovation ecosystem powering along, his understanding of the crucial role of basic research in that process is somewhat lacking.
Fortunately, CCC Council Chair Ed Lazowska (who is also the Bill and Melinda Gates Chair in Computer Science and Engineering at the University of Washington) has penned this response to help fill out the picture a bit. Here, with permission, is the note he sent to Barron's:
Your editorial tackles a critical issue. The steps that you focus on -- tax policy (particularly, making permanent the R&D tax credit) and immigration policy -- are important elements of a solution.
But improvements to our education system, to federal support of fundamental research, and to various policies that create "friction" in the innovation ecosystem, are equally important.
There have been several authoritative studies of how innovation actually occurs in information technology -- my own field.
Let me focus on research here, although there is just as much to say about the other elements of the innovation ecosystem. America is the world leader in IT innovation due to a complex interplay of universities, industry, and the federal government. Essentially every aspect of IT upon which we rely today - every billion-dollar sub-category of the IT industry - bears the clear stamp of federally-supported university-based research. See, for example, the figure on pages 6 and 7 of this National Academies study.
Continued investment is necessary to maintain our leadership and competitiveness. Achieving many of the "societal grand challenges" of this century will depend critically on further fundamental advances in IT: the engineering of new tools that will transform scientific discovery; advancing personalized learning; shifting towards predictive, preventive, personalized, participatory medicine; enhancing national security; developing smart controls and smart electric grids needed to address energy and climate challenges. Many of the "grand challenges" of IT itself will have broad implications for society: securing cyberspace; designing truly scalable systems; enhancing virtual reality; creating the future of networking; infusing "computational thinking" into a wide variety of disciplines which are themselves becoming "information sciences"; driving advances in entirely new approaches to computing such as quantum computing.
Research is the key to making progress on these grand challenges. Both industry and the federal government have important, but different, roles to play. It is crucial to avoid confusing the IT industry's research and development (R&D) expenditures with fundamental research that is guiding our way to the future. The vast majority of corporate R&D in IT - far more than 95% - involves the engineering of the next version of a product. This "development" is essential. But the transformative ideas - and our nation's long-term leadership - come from long-range research. It is a natural and essential role of government to support this fundamental research - R&D that looks out 5, 10, or 15 years, rather than just one product cycle. This federally-supported research takes place primarily in America's universities and has the benefit of producing not just the ideas that will power the nation and the world, but the people who will make them happen. The relatively modest federal investment in IT research has played an essential role in the past, and will play an equally essential role in the future.
=====
Ed Lazowska
Bill & Melinda Gates Chair in Computer Science & Engineering
University of Washington
http://lazowska.cs.washington.edu
The Chronicle of Higher Ed yesterday covered the release of a National Science Foundation Info Brief on the decline of U.S. funding for academic research for the second straight year, noting that NSF declares the decline "unprecedented for this data series, which began in 1972."
Though federal funding for academic research technically increased by 1.1 percent between FY 2006 and FY 2007, to $30.4 billion, once adjusted for inflation the "increase" actually represents a 1.6 percent decline. This follows a 0.2 percent inflation-adjusted decrease between FY 2005 and FY 2006. And, though NSF isn't reporting it yet, we already know (barring a surprise second emergency supplemental appropriation) that FY 2008 will continue that negative trend.
The Chronicle piece notes that industry's support for academic research has ramped up and actually covered most of the federal decline overall. But that was not the case in Computer Science, which still saw a decrease of 1.4 percent in academic funding from all sources. It remains to be seen how some recent highly-publicized university-industry partnerships in computing will affect FY 08 and beyond, but at this point, every little (and big) bit helps.
As the Chronicle piece also points out, it's too soon to know how the next President might handle the situation. What we do know is that the FY 2009 appropriations bills that Congress ought to be moving in advance of the Oct 1, 2008 beginning of the fiscal year are hopelessly mired in budget politics that won't likely get resolved until after November at the very earliest (and more likely next February or later). That's more bad news for science, which was again slated for big increases in those FY 09 bills. We'll keep an eye on all developments here and keep you posted.
A couple of small announcements:
First, those of you who attended CRA's biennial conference at Snowbird last week already heard this call, but for those who didn't (or who need to be reminded), we want your research highlights! CRA and the Computing Community Consortium are in the process of gathering recent computing research highlights to feature prominently in CRA and CCC publications -- on the web, in our advocacy efforts, and in our print publications -- and we'd like yours.
What we're asking is that you add this e-mail address -- highlights@cra.org -- to any press release distribution list your department or institution may have to publicize your exciting research results. We're gathering those interesting stories, putting them into a searchable database, and then highlighting selected ones on the CRA and CCC websites. The model here is something like the very popular Astronomy Picture of the Day, where each day a new photo or graphic (or video) having something to do with astronomy is featured along with a nice succinct description. While we don't anticipate being able to feature new computing research daily, we hope to refresh it frequently enough (weekly?) to make it worth checking back often. But, to do that, we need your highlights.
To fill the pot, we're accepting any release your department or institution may have sent in the last 24 months or so. Obviously, we'd like to feature the most timely ones, but we don't mind pushing the clock back a bit for anything truly exciting. So, please submit yours today, and make sure your press offices have highlights@cra.org on their distribution lists.
In other news, we've created some new CRA-related "groups" on two popular social networking sites: LinkedIn and Facebook. Both are for those involved in, or just fans of, CRA. To join the LinkedIn one, go here and we'll approve you. On Facebook, you can find us here. We hope you'll take a look!
Last Tuesday, NYT science commentator John Tierney discussed how Congress has recently ramped up enforcement of Title IX among universities' science departments. Will a "quota system"--an idea Tierney floats in the third paragraph of his piece--be an outcome of Title IX enforcement?
So far, the increased enforcement has only consisted of periodic compliance reviews, which had been long-neglected by the NSF, Department of Energy, and NASA, according to a 2004 Government Accountability Office report. These reviews are intended to make sure grantee departments are not discriminatory.
Of course, since some fields like computer science have many more men than women--both among students and faculty--there is concern that the government might start considering everyone "discriminatory" using the yardstick of proportionality and quotas. For athletics departments, such rigorous Title IX enforcement has led to a huge increase in the participation and achievement of women athletes, but at the expense of some male sports.
The sciences are not necessarily in the same boat as sports: although most would agree that women face an uphill battle in the sciences, how much of the gap can be explained by discrimination remains an open question. "60 percent of biology majors and 70 percent of psychology Ph.D.'s" are women, raising the possibility that more women simply prefer other fields, as psychologist Susan Pinker argues.
Another possibility is that if discrimination is having any effect, most of it happens before girls reach college. One study suggests that differences at adolescence explain different outcomes 20 years later.
For now, though, the compliance reviews haven't rocked any boats. But the threat of a Title IX bludgeon hanging over departments' heads is sure to add urgency to debates about the shortage of women in fields like computer science and what to do about it.
Ed Lazowska, Chair of the Computing Community Consortium, has a passionate post today on the CCC Blog about what the latest numbers from CRA's Taulbee Survey really mean. The news is not, he points out, that computer science bachelors degrees show another year of decline -- that was completely predictable from the enrollment statistics for freshman CS majors published four years ago in the survey. The real news (as we noted back in March) is that for the first time in many years, freshman interest in CS as a major increased and enrollments have stabilized -- indicating that perhaps we may have turned a corner. What's responsible for the turnaround? According to Lazowska:
[B]y far the most important factors are (a) the job market (or people’s sense of the job market), and (b) the level of “buzz” associated with the field.
Ed also talks about the experience at his institution, the University of Washington; tries to put the "crisis" in computer science in perspective by offering up some comparisons to the other science and engineering disciplines; and emphasizes the bright outlook suggested by various Dept. of Labor workforce projections (pdf). In typical Lazowska style, it's a forceful but accurate refutation of the standard story on CS enrollments we've seen for the last few years. It's definitely worth a read (and comment!) over at the excellent CCC Blog. (Disclaimer: CCC is an activity of CRA, but that doesn't make it any less awesome.)
Let’s start by considering graduate enrollment, rather than undergraduate enrollment. For the past 15 years, the number of Ph.D.s granted annually in computer science has been in the 900-1100 range. Suddenly, though, in the past 2 years, it has climbed to 1800. Why is this? The answer is totally obvious:
This is not a news flash — it didn’t take a genius to predict, a few years ago, that it was going to happen, and it doesn’t take a genius to explain it, either.
- In 2001, lots of startup companies went bust.
- This dumped onto the job market a number of the best bachelors graduates from a few years before, who now had two or three years of experience under their belts.
- This made it hard for some excellent new bachelors graduates of 2001 and 2002 to get the super-exciting jobs they had anticipated — they were competing with people whose academic records were every bit as good as theirs, but who also had 2 or 3 years of experience working at a hot startup.
- Because these great new bachelors graduates couldn’t get exciting jobs, they went to graduate school instead.
- And, mirabile dictu, 6 years later, they’re emerging with Ph.D.s.
Similarly for bachelors degrees. Starting in about 2002, there was lots of news about the tech bust. Tech was no longer sexy. Jobs were no longer plentiful. Subsequently, there was a lot of misleading information about the impact of offshoring. And the newspapers never bothered to report that by late 2004, US IT employment was back to the 2000-2001 level — we had fully recovered from the bust — somehow that wasn’t considered newsworthy. So it’s not surprising that interest in bachelors programs decreased sharply, and that 4 and 5 years later, the number of degrees granted precisely mirrored this decline.
Also, it’s not surprising that things are turning around. Google is hot. Tech in general is hot. There are startups everywhere. It’s clear to anyone that there are plenty of jobs. (By the way, given the incredible state of today’s bachelors job market, it doesn’t take a genius to predict that the number of Ph.D. graduates in 2014 will show a decline. When you read the scary headlines 6 years from now, remember that you heard it here first!)
Noah Shachtman has an interesting post on the Danger Room Blog at Wired noting that the Pentagon has "reprogrammed" $32 million of DARPA funding, including $2 million from the Information and Communications Technology account, because of DARPA's inability to attract program managers and spend the money allocated to it. From the Reprogramming Action (pdf) report:
"DARPA continues to underexecute its Research, Development, Test and Evaluation programs for two reasons: first, several key program managers' positions are unfilled because there are few experts in advanced sciences and technology, and second, DARPA's approval process is delaying contract awards."
If I had to guess, I'd say the latter reason might have something to do with the former, too.
It's certainly possible that the same policy changes at DARPA that have made it more difficult for university researchers to work on DARPA problems have also made DARPA a less-desirable place to spend a few years, but that's just my speculation....
[Dustin Cho is CRA's new summer fellow from the Tisdale Fellowship Program, which has been bringing college students to Washington, DC, for internships that explore current public policy issues of critical importance to the high-tech sector. Dustin is a recent graduate of Yale University with a degree in political science and an interest in the intersection of public policy and technology. After suffering through what is sure to be a torturous summer with us here at CRA World HQ, Dustin plans to begin law school at Harvard in the Fall. Until then, expect to see plenty of his writing here on the blog as we wring all we can out of him. -- Peter]
I’ve just finished reading the RAND report, and as Peter points out, its authors take the contrarian position that U.S. science is as competitive as ever. They contend that the U.S. remains on top, and we’re not in danger of being overtaken because our R&D growth rates are pretty much the same as the rest of the world. According to RAND, there are only a few countries whose R&D growth outpaces ours, such as China and Korea, and all of these countries are starting from next to nothing (from 1993 to 2003, China only had to add $6B per year to grow at 17 percent, while the U.S. was adding more than double that amount annually and growing at 5.8 percent). Journalists’ interpretation: there’s nothing to worry about.
That’s a dramatic oversimplification, because the underlying message of the report is that we should stop looking at R&D as a horse race – and that R&D is crucial to the United States’ future, regardless of what other countries are doing.
The report argues that it’s nonsense to talk about R&D expenditures as “competition” between countries, since one country’s scientific advancements will end up increasing the standard of living for everyone in the world who can access its derivative technology. In fact, there are probably network effects to research such that increased funding actually has increasing returns – in other words, if there’s already a lot of worldwide R&D, then an extra dollar spent on research will allow another scientist to build off of other researchers’ developments, increasing every scientist’s productivity. So when other countries (or the U.S. itself) decide to invest more heavily in R&D, U.S. R&D productivity actually improves.
That said, the report also emphasizes the importance of maintaining the U.S.’s comparative advantage in R&D. Right now, it’s relatively cheaper to do science and technology research in the U.S. due to our infrastructure, labor, and funding advantages. But as Harvard economist Richard Freeman points out, if other countries (such as China) overtake us in these areas, their lower wages might actually give them the comparative advantage, thereby severely damaging the U.S. economy as we’re forced to retool our infrastructure toward different industries. Freeman thinks it’s likely poorer countries will somewhat succeed in this by specializing in certain subfields and producing a lot of highly educated researchers. But the U.S. will be better equipped to maintain its comparative advantage if we encourage immigration of skilled researchers, increase federal funding, and improve infrastructure for R&D.
The RAND report also shows that life sciences have received disproportionate federal funding, resulting in a glut of life sciences PhDs and hurting their salaries. In other S&T fields, employment demand has outstripped degree production. “The most notable instances of divergence between employment growth and growth in degrees are mathematics/computer sciences and physical sciences,” the report explains. “Mathematics/computer sciences degrees grew by 4 percent per year [from 1980 to 2000] – the highest rate of degree growth in S&E – while mathematics/computer sciences employment grew by more than twice that, 9 percent per year.”
In fact, the only reason we have comparable R&D growth rates to other countries in federal funding is due to increased life sciences funding – non-life sciences S&T growth has basically flatlined. Private investment in R&D has increased, but it’s no replacement for federally funded academic research: “Even though industrial R&D is much larger than academic research expenditures, academic spillovers increase the R&D performed by industry significantly, and have a comparable effect on patents.” The report argues that network effects from increased academic research improve the productivity of private R&D.
Since the bulk of the report examines ways to improve the United States’ R&D, it’s disappointing that media coverage (and the RAND press release itself) choose to overemphasize the counterproductive message that the U.S. is still the world leader in science and technology. Instead, shouldn’t we focus on how to keep it that way?
Two recent pieces in The Chronicle of Higher Education riff off a just-released report by the RAND Corporation to make the case that those who have argued that U.S. science and technology dominance is at risk in a globally competitive world are exaggerating.
Richard Monastersky writes in "Despite Recent Obits, U.S. Science and Engineering Remain Robust":
Although Congress, President Bush, and top university chancellors have publicly fretted about the declining health of science and engineering in the United States, a new report argues that the U.S. has maintained its supremacy in those sectors. Further, the report says, the nation should not overreact to overseas growth in technological prowess.
And Daniel Greenberg writes in "Call Off the Funeral: Science in U.S. is Lively and Growing":
The RAND report stands out because gloomy findings predominate in assessments of American science. In 1985, for example, the chairman of the House Appropriations Subcommittee for the National Science Foundation expressed exasperation with the din of doom: "It’s the same argument every year, about losing the lead." In 2005, the National Research Council—the research arm of the National Academy of Sciences and its subsidiaries—issued a blockbuster compilation of R&D anxiety, "Rising Above the Gathering Storm," which still reverberates around Washington as science-policy gospel.
The thing is, I'm not sure there are many within the science advocacy community who would disagree with the primary findings of the RAND report, U.S. Competitiveness in Science and Technology. The report found that the U.S. continues to be the world leader in S&T innovation; that federal support for research is generally up over the last decade or so -- though that increase has come almost entirely in the life sciences, while the physical sciences have been held essentially flat; that there is lots of opportunity in the science and engineering workforce; and that the U.S. continues to be heavily dependent on our ability to attract the best and the brightest in the world to work and study here.
Not many, if any, in the DC science advocacy community would disagree with those assessments. The concerns, of course, are the trend lines -- they are almost all trending the wrong way. (The Task Force on the Future of American Innovation has a good compilation of many of these benchmarks in their Benchmarks of Our Innovation Future II report.) Our competitors worldwide are every day increasing their capacity to compete with us -- investing in better facilities, more partnerships, increased investments in key areas -- and we're concerned the U.S. isn't matching them with anything close to the same intensity.
Gene Spafford, one of my Government Affairs Committee members, notes that these pieces also seem to give short shrift to the disruptive effect one or two key discoveries can have -- think light bulb, antibiotics, the transistor, controlled fission, the Internet, and more. Right now there is intensive research in genetics, nanotechnology, parallel computation, fusion, alternative energy and several other areas. A major advance in any one of them would be transformative on a large scale. It won't be incremental. If we're concerned about our national position as opposed to simply the advancement of science, then we want to somehow ensure that those advances happen here. And that requires having a prepared base and an active set of programs of inquiry.
The U.S. is the global leader in science and technology. It's true that the U.S. has enough of a lead at this point to "decay gracefully" (as Newt Gingrich describes it). But I'm not sure that's what most want for this country, or for their children and grandchildren who will have to live in it.
The saying is that a picture is worth a thousand words. Well, NSF and AAAS agree and are sponsoring the sixth annual Science and Engineering Visualization Challenge. There are five awards categories: Photographs/Pictures, Illustrations/Drawings, Informational/Explanatory Graphics, Interactive Media, and Non-Interactive Media. The deadline for entries is May 31.
The premise of the Challenge is that science is often communicated through visuals better than words, particularly in our web and graphics culture. Winning entries in each category will be published in Science Magazine and Science Online as well as at the NSF web site. One of the winners will be on the cover of Science Magazine’s September 26 issue.
More information and winning entries from the previous five years can be found here.
The joint investment announced yesterday by Microsoft and Intel in two university research centers (one at Berkeley and one at University of Illinois Urbana-Champaign) in order to work on solving the challenges of multi-core computing is all over the news, but there's an aspect of the story that hasn't been highlighted sufficiently. The NY Times' John Markoff picked up on it, however:
Both Intel and Microsoft executives said the research funds were a partial step toward filling a void left by the Pentagon's Defense Advanced Research Projects Agency, or Darpa. The agency has increasingly focused during the Bush administration on military and other classified projects, and pure research funds for computing at universities have declined.
"The academic community has never really recovered from Darpa’s withdrawal," said Daniel A. Reed, director of scalable and multicore computing at Microsoft, who will help oversee the new research labs.
[Dan Reed is also the current Chair of CRA.]
We've noted many, many times on this blog our concerns with policy changes at DARPA since about 2001 that have had the effect of pushing university researchers away from DARPA-sponsored research. As we wrote as recently as September 2007, shorter research horizons with an emphasis on go/no-go milestones at relatively short intervals and an increased use of classification at the agency has sharply reduced the amount of DARPA-supported research being performed in U.S. universities. In fact, between FY 2001 and FY 2004 (the last year for which we have good data), the amount of funding from DARPA to U.S. universities fell by half -- and informal evidence suggests university shares are even lower today.
While it's great news that two of the titans of the IT industry are stepping up to fill some of the gap left by DARPA's withdrawal, their $20 million investment over 5 years represents just a tiny fraction of the DARPA shortfall. The difference in DARPA funding for university computer science between 2001 and 2004 was $91 million annually ($214 million in FY 01 to $123 million in FY 04 in unadjusted dollars), and anecdotal evidence suggests that shortfall may be even larger now. The Microsoft-Intel investment is a bold move and big commitment to address a key challenge in computer science that's a primary concern for the two companies in the future. But it doesn't represent a sustainable alternative to filling the hole left in the IT R&D portfolio created by DARPA's absence.
DARPA has taken some steps to try to bring university researchers, especially younger faculty, back into the fold. In February, the agency also reorganized its IT office structure a bit -- merging the Information Exploitation Office (IXO) with the Information Processing Technology Office (IPTO) to create a new Information Processing Techniques Office (IPTO) under former IPTO Deputy Chuck Morefield. There's some indication that the office will have a technology focus (which suggests a research emphasis) in addition to a systems focus (which suggests a development-oriented emphasis), so there may be increased opportunities for university researchers to participate in DARPA-sponsored work.
We hope so, because while it's great to see the IT industry step up and make some commitments to university-led research, the country (and the DOD, and the world) is probably better served by a DARPA that's re-engaged with the university research community, supporting long-term, DARPA-hard research at a range of institutions on some of the grand challenges in computing....
Craig Barrett, Chairman of Intel, comes out swinging over the debacle that was the FY 08 Omnibus Appropriations Act and its impact on federal support for the physical sciences, computing, mathematics and engineering, in a piece that runs today in the San Francisco Chronicle (which should get Speaker Nancy Pelosi's (D-CA) attention). The whole piece is well worth reading, but I thought his conclusion was remarkably on point:
The United States stands at a pivotal point in our history. Competition is heating up around the world with millions of industrious, highly educated workers who are willing to compete at salaries far below those paid here. The only way we can hope to compete is with brains and ideas that set us above the competition - and that only comes from investments in education and R&D. Practically everyone who has traveled outside the United States in the last decade has seen this dynamic at work. The only place where it is apparently still a deep, dark secret is in Washington, D.C.
Wow. What are they thinking? When will they wake up? It may already be too late; but I genuinely think the citizenry of this country wants the United States to compete. If only our elected leaders weren't holding us back.
There's an interesting piece running now in BusinessWeek by Microsoft Researcher Bill Buxton that capitalizes on the buzz around the concept of the "long tail" in business by arguing that there's an equally important "long nose" in business innovation that represents the long period of research and development that's required to bring innovative products to market. Here's a snip:
My belief is there is a mirror-image of the long tail that is equally important to those wanting to understand the process of innovation. It states that the bulk of innovation behind the latest "wow" moment (multi-touch on the iPhone, for example) is also low-amplitude and takes place over a long period—but well before the "new" idea has become generally known, much less reached the tipping point. It is what I call The Long Nose of Innovation.
It's a great article and certainly worth reading in full.
In the piece, he mentions a chart Butler Lampson presented to the Computer Science and Telecommunications Board of the National Research Council that traced the history of a number of key technologies. That's this chart (frequently referred to as the "tire tracks" chart, for reasons that should be apparent). The chart originally appeared in a 1995 CSTB report, in which the CSTB had identified 9 billion-dollar sectors in the IT economy that bore the stamp of federally-supported research. They revised the chart in 2003 and identified 10 more sectors. I'm guessing that if they revised it again today (and I understand they are), you could add at least three more billion-dollar sectors -- "Search," "Social Networks," and "Digital Video" -- all enabled in some way by long-term research, usually supported by the federal government ... exactly the type of long-term research that got hit hardest in this year's appropriations debacle.
(Ed Lazowska's testimony before the House Government Reform committee in 2004 contains an extended riff on the chart -- how it shows the complex interplay between federally-supported university-based research and industrial R&D efforts; how industry-based R&D is of a fundamentally different character than university-based R&D; how the chart illustrates how interdependent the IT R&D ecosystem really is; and how university-based research produces not just ideas, but people, too. It's all under the section titled "The Ecosystem that Gives Birth to New Technologies," though the whole testimony is certainly worth a read, too.)
A Washington Post article today talks about the first petascale supercomputers expected to come online next year. The article points out the vast range of other fields assisted by computing at such a large scale, including geography, medicine, and even financial markets. Here’s a sample:
The first "petascale" supercomputer will be capable of 1,000 trillion calculations per second. That's about twice as powerful as today's dominant model, a basketball-court-size beast known as BlueGene/L at the Energy Department's Lawrence Livermore National Laboratory in California that performs a peak of 596 trillion calculations per second.
The computing muscle of the new petascale machines will be akin to that of more than 100,000 desktop computers combined, experts say. A computation that would take a lifetime for a home PC and that can be completed in about five hours on today's supercomputers will be doable in as little as two hours.
"The difficulty in building the machines is tremendous, and the amount of power these machines require is pretty mind-boggling," said Mark Seager, assistant department head for advanced computing technology at Lawrence Livermore. "But the scientific results that we can get out of them are also mind-boggling and worth every penny and every megawatt it takes to build them."
An interesting read and definitely worth checking out.
Questions about NSF's new $52 million Cyber-enabled Discovery and Innovation initiative? The Chronicle of Higher Education is hosting a "Brown Bag" discussion on the topic with CDI program director Sirin Tekinay on Thursday, November 8th, at noon ET. You can submit your questions now and Sirin will join the discussion on Thursday with answers.
As we've mentioned previously, the CDI initiative is a cross-Foundation initiative aimed at "[broadening] the Nation's capability for innovation by developing a new generation of computationally based discovery concepts and tools to deal with complex, data-rich and interacting systems." The $52 million initiative* will be led by CISE (which will control about $20 million), with participation from Engineering, Mathematics and Physical Science, Social, Behavioral and Economic science, Cyberinfrastructure, International Science, and EHR. NSF hopes to grow the program in successive budget years up to $250 million in 2012, with CISE controlling a proportional share. So this is potentially a very big deal.
Tune in to the chat on Thursday and learn more!
* NSF requested $52 million for the program in FY 08, and Congressional appropriators have included full funding for the program in their as-yet-unpassed appropriations bills. However, the Chronicle describes CDI as a $26 million program and I'm not sure where that number came from. In any case, the final total for FY 08 won't be known until Congress and the President sort out the mess that FY 08 appropriations has become....
Two recent Information Week articles are of interest. The first article discusses the Commission on Professionals in Science and Technology’s newly released report regarding the IT workforce and the need to increase the representation of women and minorities to keep America competitive. This was a theme at the recent conferences in Florida, the Richard Tapia Celebration of Diversity in Computing and the Grace Hopper Celebration of Women in Computing. The report is free and available at the CPST web site, but you do have to register to access it.
The second article is about the National Research Council report encouraging open exchange of science and technology research on the international stage. The article states the Council’s understanding that there are matters of national security that the United States is trying to protect by classifying research but that “the possibility that the United States might lose its edge in technology and research represents one of the greatest risks to national security.” Again the report is available online and is worth reading.
Computerworld has fantastic coverage of the 50th anniversary of the Sputnik launch (Oct. 4th, 1957) and why, in a sense, we can thank the Soviets for helping create the conditions that made the U.S. the technological superpower it is today.
Computerworld's Gary Anthes' piece "Happy Birthday Sputnik! (Thanks for the Internet)" does a great job of chronicling how the federal government's reaction to the surprising Soviet launch created an agency and a research funding culture that proved so extraordinarily productive that nearly every billion-dollar sub-sector of the IT economy today bears its stamp. In the process, he checks in with a number of important figures from computer science who note that the productive culture within DARPA responsible for much of that early innovation seems to have waned -- and perhaps isn't even possible today.
Rather than quote snippets from the piece, I'd just encourage you to read all of it -- it's the piece I would've tried to write in honor of Sputnik's 50th if Anthes hadn't (I'm glad he did...it's assuredly better than anything I would've come up with).
Two other portions of the coverage are worth checking out, too. Computerworld did a pretty good job of simplifying the CSTB's "tire tracks" chart that shows the development of technologies from the initial research in university or industry labs to the time the products that resulted became billion-dollar industries.
And there's a good interview with former (D)ARPA director Charles M. Herzfeld on the state of IT research now.
It's all definitely worth a read.
Long-time readers of this blog, or anyone familiar with CRA's policy efforts, will know that we've spent a lot of time raising concerns about policy shifts at the Defense Advanced Research Projects Agency (DARPA) that have cut university participation rates in DARPA-funded computer science research. In congressional testimony and blog posts, we've pointed out that a shift at DARPA -- a focus on nearer-term efforts with an emphasis on go/no-go milestones at relatively short intervals and an increased use of classification -- has sharply reduced the amount of DARPA-supported research being performed in U.S. universities. In fact, between FY 2001 and FY 2004 (the last year for which we have good data), the amount of funding from DARPA to U.S. universities for computer science research fell by half -- and informal evidence suggests university shares are even lower today.
There are a number of reasons we're concerned about this trend. For one, DARPA's diminished support for university CS leaves a hole in the federal IT R&D portfolio -- both in funding and, maybe more importantly, in the loss of the "DARPA model" of research support. Since the early 1960s, the country (indeed, the world) has reaped the benefits of the diverse approaches to funding IT research represented by the two leading agencies -- NSF and DARPA. While NSF has primarily focused on small grants for individual researchers at a wide range of institutions -- and support for computing infrastructure at America's universities -- DARPA's approach has been to identify key problems of interest to the agency and then assemble and nurture communities of researchers to address them. The combination of models has been enormously beneficial -- DARPA-supported research in computing over the last four decades has laid down the foundations for the modern microprocessor, the internet, the graphical user interface, single-user workstations and a whole host of other innovations that have made the U.S. military the best in the world, driven the new economy, changed the conduct of science and enabled whole new scientific disciplines.
But DARPA's policy shift also impacts its own mission, which is to ensure the U.S. never again suffers the sort of technological surprise marked by the Soviet launch of Sputnik (which motivated the establishment of the agency nearly 50 years ago). DARPA's move away from support of university researchers means that many of the brightest minds of the country (indeed, the world) are no longer working on defense-related problems. This loss of mindshare -- the percentage of people working on DARPA-related problems -- is very worrisome to those in the community who understand how much of America's advantage on the battlefield (and in the marketplace) is owed to a network-centric strategy. I hear concerns from the "old guard" in many of America's top university CS departments that there's a whole generation of young researchers who have no experience working with DARPA or the Defense Department and who are not attuned to defense problems -- a fact that doesn't bode well for the future of the U.S. technological advantage and DARPA's goal of preventing technological surprise.
To their credit, the folks at DARPA recognize that this lack of awareness among younger faculty of the types of problems DARPA would really like to solve is a situation that needs addressing. And one way they're approaching the problem is very direct -- they're finding young faculty with research areas of interest to the agency and, well, taking them on a little tour of the DOD. The Computer Science Study Group, run by the Institute for Defense Analyses for DARPA, serves to "acclimate a generation of researchers to the needs and priorities of the DOD" through mentoring, workshops, field trips to DOD facilities, and fairly elaborate (and pretty kewl) show-and-tells. An interesting article today on Rensselaer ECSE professor Rich Radke's experience has some details on CSSG goals and methods:
The multi-year program familiarizes up-and-coming faculty from American universities with DoD practices, challenges, and risks. Participants are encouraged to view their own research through this new perspective, and then to explore and develop technologies that have the potential to transition innovative and revolutionary computer science and technology advances to the government.

"The basic idea is to expose young faculty to Department of Defense-related activities, via briefings by military and intelligence officers and 'field trips' to military and industrial bases," Radke said. "It is truly a hard-core experience filled with days of interesting briefings and up-close show-and-tell with vehicles and equipment."

Read the whole piece for details of his adventures.
2007 was the first year for the CSSG and the $4.5 million program supported about a dozen young researchers. DARPA has requested an increase in the program for FY 08 ($7 million) and FY 09 ($7.7 million), so hopefully we'll see that number start to rise.
The DARPA CSSG program is one part of addressing the overall problem. The larger concern is the importance of bringing DARPA back into the university research fold -- not because it would benefit academic researchers, but because it impacts the mission success of the Department of Defense (and hence our national security). A number of factors suggest that maybe it's time to focus on the goal of increasing mindshare of the best brains working on U.S. defense-related problems. For one, because of U.S. visa policies, the best minds in the world increasingly won't necessarily be coming to the U.S. Second, the research capacity of our potential adversaries increases daily. And finally, the increase in foreign investment in U.S. university research departments means that competition for U.S. university mindshare is only growing -- in some cases from countries we'd rather not see gain a competitive leg up on us. So, programs like CSSG are really important. But maybe so are some bigger policy issues across the agency....
Forbes.com has an interesting article about a survey on the role of women in patents. The survey (PDF), from the National Center for Women & Information Technology, shows that patents by mixed-gender teams are cited more often than those of single-gender teams.
Not a lot of new information in the article but it points out something that CRA and NCWIT have been saying for a long time: a diverse workforce is an asset to American business.
"Our data show that diversity of thought matters to innovation," says NCWIT Chief Executive Lucinda Sanders, who holds six telecom software patents. "We can say involving women is important because women are half the population and have good ideas, but our study shows the impact for companies."
It’s worth a read.
John Schwartz of the New York Times has an interesting piece today on the rise in complexity of networked applications and the risks that complexity poses. Headlined Who Needs Hackers?, the piece makes the point that the biggest threat to these systems isn't malicious users, but complexity itself. Understanding how these giant interconnected systems work (or not) is a great challenge for the community.
"We have gone from fairly simple computing architectures to massively distributed, massively interconnected and interdependent networks," [Andreas M. Antonopoulos, a founding partner at Nemertes Research] said, adding that as a result, flaws have become increasingly hard to predict or spot. Simpler systems could be understood and their behavior characterized, he said, but greater complexity brings unintended consequences. "On the scale we do it, it's more like forecasting weather," he said.

By the way, addressing this challenge is one of the goals of those proposing the Global Environment for Networking Innovations research network that we've discussed before in this space.
It's done! It's done! By now, I expect that everyone has heard that both the House and Senate have agreed on the conference report for H.R. 2272, The America COMPETES Act and that the measure is headed to the President for his signature.
Word comes from the White House today that the President will sign the bill in a small signing-ceremony tomorrow with the Members of Congress who were instrumental in moving the bill along. While it's a bit of a bummer that the President isn't making a big "to-do" about this with representatives from industry and academia and lots of press -- it does, after all, enact many portions of his own American Competitiveness Initiative, and it's also an issue that polls really well, a fact you'd think would be important to both a Congress and a President who could use a few good examples of positive, bi-partisan legislation to show off -- the important thing is it's getting signed. After nearly two years of wrangling over this particular set of proposals -- and a lot longer than that to get the Administration and the Congress to understand the import of the problems addressed -- the President will sign the bill and its provisions will be law.
That deserves some kudos, back-patting, and maybe one or two loud "whoo-hoo's."
Especially because this bill has a lot of good things in it. As Cameron Wilson points out over on the USACM Technology Policy Blog, the bill takes two basic routes to fostering the innovation the country will require to stay competitive in an increasingly global world. It addresses federal support for research -- both authorizing large amounts of new funding for three key science agencies (National Science Foundation, NIST, and the Department of Energy's Office of Science), setting a target to double the agencies budgets over 7 years, and by creating a new high-risk research agency at the Department of Energy (called the Advanced Research Projects Agency - Energy, or ARPA-E, in a nod to the DARPA-like character Congress hopes the agency will adopt). And the bill addresses a diversity of Science, Technology, Engineering and Mathematics (STEM) Education efforts. For these, I'll simply steal what Cameron has already written:
The bill authorizes $43.3 billion over the next three fiscal years for STEM education programs across the federal government. The variety is impressive, ranging from new K-12 teacher programs to new opportunities for undergraduate and graduate STEM students. Here is a sampling of the proposals:
- Expands the Robert Noyce program, which pairs students in STEM fields with education degrees so they can teach STEM in K-12;
- Authorizes two new competitive grant programs that will enable partnerships to implement courses of study in mathematics, science, engineering, technology or critical foreign languages in ways that lead to a baccalaureate degree with concurrent teacher certification;
- Authorizes competitive grants to increase the number of teachers serving high-need schools, expand access to AP and IB classes, and increase the number of qualified AP and IB teachers in high-need schools; and,
- Expands early career grant programs and provides additional support for outstanding young investigators at both NSF and DOE.
In addition, the legislation has several provisions that expand outreach to women and minorities in STEM fields. The underrepresentation of women and minorities has been a key problem in computing, so this is another welcome effort.
The bill also contains two provisions I wanted to highlight because they're of particular interest to the computing community:
The first is Section 7024, "High-performance Computing and Networking" (if you're following along at home (pdf)) -- the inclusion of the High-Performance Computing Research and Development Act that has been much discussed on these pages since some of the earliest days of this blog. The bill has been proposed in various forms in every session of Congress since the 106th (we're now in the 110th) and has never gained the full approval of the Congress -- almost always for reasons unrelated to the bill. The bill has, in sessions past, been approved by the House only to languish in the Senate due to jurisdictional fights over other bills, approved by the House Science committee only to run afoul of budget disputes with the GOP Leadership, and been held hostage over fights about NASA between the House and Senate. In fact, until the approval of the conference report last week, it was assumed that this version of the HPC R&D Act might meet a similar fate, as word escaped that some of the Senate conferees thought its inclusion might cause some jurisdictional friction between two Senate committees (Energy and Commerce, who both claim HPC jurisdiction). But those problems were resolved, and the bill includes the full House-approved language, plus an extra section that authorizes efforts in "Advanced Information and Communications Technology Research" at NSF, including research on:
Otherwise, the HPC R&D Act remains essentially unchanged, which means it includes two provisions we particularly like: it requires the White House Office of Science and Technology Policy to develop and maintain a research, development, and deployment roadmap for the provision of federal high-performance computing systems; and there's now an explicit requirement that the President's advisory committee for IT (now PCAST) review not only the goals of the federal Networking and Information Technology Research and Development program, but the funding levels as well and report the results of that review to Congress every two years.
The second noteworthy provision in the COMPETES bill is one (Sec. 7012) that was originally included in the House-passed NSF Authorization Act of 2007 (H.R. 1867), that should help clarify NSF's role in supporting efforts that seek to encourage the participation of women and underrepresented groups in computing, science, technology, engineering and mathematics. As we noted back in March, this is a response to long-standing concerns from CRA and other members of the computing and science communities about NSF's role. Basically, NSF's general policy is to only support efforts that represent novel approaches. Yet, what's often needed in these cases isn't a novel approach, just a sustained one. The House Science and Technology Committee agreed and included language in the NSF Authorization that addresses the issue by allowing the Director of NSF to review such programs one year before their grants expire and issue extensions of up to three years without recompetition to those efforts that appear to be successful at meeting their stated goals. It also emphasizes that the committee believes this sort of effort -- maintaining the strength and vitality of the U.S. science and engineering workforce -- is appropriately part of the agency's mission. So, we're thrilled that the provision survived the conference and will become law with the President's signature tomorrow.
This is, of course, not the end of innovation efforts in the Congress or the Administration. While this bill sets nice, juicy funding targets for NSF, NIST and DOE Office of Science, it doesn't actually appropriate a single dime, so the focus will continue to be on House and Senate appropriators as they wind their way through the appropriations process later this year. We're still expecting a meltdown in that process, so nothing is guaranteed, despite all the supportive words from Congress and the President. And there will be further efforts to address some of the pieces of the various innovation agendas that aren't represented in H.R. 2272 -- like a permanent extension of the R&D tax credit.
But for now, I think it's probably appropriate to take a deep breath and savor this win for a day or two. This is a big victory for the science community and a long time coming for those of us who have been working these issues around the Hill over the better part of the last decade. We commend the President and the Congress for having the vision and the commitment to push ahead on these issues, even when it didn't seem as politically popular as it is today. And we commend the members of the science community for speaking up on these issues, serving on the advisory committees, and participating in the grassroots efforts to make Congress aware of the issues. Now, just make sure you go out and do world-leading science -- take risks, think audaciously...demonstrate as you've done so well in the past why America needs to continue to be an incubator for invention, discovery, and innovation.
And keep it tuned here for all the details... :)
Update: (8/9/07) -- It's official!:
President George W. Bush signs H.R. 2272, The America Competes Act, Thursday, Aug. 9, 2007, in the Oval Office. Pictured with the President are, from left: Director John Marburger of the Office of Science and Technology Policy; Senator Jeff Bingaman of N.M.; Congressman Bart Gordon of Tenn.; and Senator Pete Domenici of N.M. White House photo by Chris Greenberg
Update2: (8/10/07) -- Here are the President's comments about the bill and ACI, as well as an OSTP-produced fact sheet.
The National Science Foundation has published two reports on the declining share of U.S. research in scientific journals. The reports show that, beginning in 1992, journals began to publish proportionally less U.S.-based research, with a corresponding rise in published research from Europe and Asia. The U.S. share of research published in journals was 37 percent in 1992; by 2003 it had fallen to 30 percent. The reports offer a number of possible reasons for the decline, including the growth of scientific research performed in Europe and Asia as well as more international collaboration on research in all fields.
Both reports are interesting and worth a read along with an article about them in Inside Higher Education. A third report on the topic is planned.
The Chronicle of Higher Education (sub. req’d.) has a great article on the future of the Internet and the Global Environment for Network Innovations or GENI. It contains quotes from many participants of the new Computing Community Consortium (CCC) that CRA helped launch.
The article talks about the problems with the current state of the Internet:
Identity theft, viruses, and attacks on Web sites are on the rise — a few weeks ago the country of Estonia was practically shut down, digitally, by deliberate attempts to jam government computers. Spam, which was less than 50 percent of e-mail traffic back in 2002, is now close to 90 percent, according to Commtouch Software Ltd., an Internet-security company.Moreover, the Internet has great difficulty coping with the sharp increase in mobile devices like cellphones and laptops, and handling bandwidth-hungry traffic such as video, now demanded by an increasing number of users.
GENI and its possibilities are discussed in great detail:
The people pushing for change are the very people at universities and colleges who built the Internet in the first place. Researchers at the Massachusetts Institute of Technology, the University of California at Berkeley, and the University of Southern California, among others, have joined Mr. Peterson in the GENI planning process. Industry players such as chip-maker Intel are also on board.…
In late May of this year, the science foundation awarded Cambridge-based BBN Technologies the job of planning GENI, giving them $10-million to spend over the next four years. The company has deep roots in the old Internet: It built the first network segment connecting four universities back in 1969.
Chip Elliott, the BBN engineer who will be running the GENI project office, thinks the project calls for two approaches. "First, if you don't like conventional Internet protocols, try something completely different. Second, do it on a large enough scale, with enough users, so that your results actually mean something." People associated with GENI say that "large enough" means access for experimenters at several hundred universities and, eventually, a user community in the tens of thousands.
Thousands of users will provide a crucial dose of reality, say planners. Over the years, there have been many papers published on new Internet design, and simulations run on networks such as PlanetLab. "But you don't know how an Internet design will behave until a large group of people actually use it," says Ms. Zegura, who will co-chair a GENI science council charged with rounding up ideas from the research community. "They will do things that you don't expect, just like in the real Internet, and then you'll see how robust your idea is. That's where the rubber meets the road."
The computing community has had these concerns for quite a while, so it's not surprising to see other disciplines noting similar issues with DARPA in this OpEd written by David Ignatius in Friday's Washington Post:
DARPA once liked to boast that it took on impossible problems and wasn't interested in the merely difficult. But in recent years, the scientists argued, DARPA has become nearly as cautious and prone to micromanagement as the government's science behemoth, the National Institutes of Health. Before making most of its grants, the NIH demands such detailed evidence of success that it is "funding the past, not the future," one scientist complained. "DARPA seems to be shifting to the NIH model -- more near-term, more risk-averse," said Don Ingber, a professor of pathology at Harvard.

For more background, in addition to all the links above, be sure to check out CRA's Information Technology R&D page, which has tons of links to previous press reports on the issue.
David Broder writes about the America COMPETES Act in his column today at the Washington Post. It contains this great quote from Sen. Lamar Alexander (R-TN), one of the sponsors of the Act:
"Last week," he said, "while the media covered Iraq and U.S. attorneys, the Senate spent three days debating and passing perhaps the most important piece of legislation of this two-year session. Almost no one noticed."

Alexander has a point. The bill, boldly named the America Competes Act, authorized an additional $16 billion over four years as part of a $60 billion effort to "double spending for physical sciences research, recruit 10,000 new math and science teachers and retrain 250,000 more, provide grants to researchers and invest more in high-risk, high-payoff research."

Read the whole thing.
Interesting article (requires free registration) on the innovation agenda in the San Jose Mercury News. While it does focus mostly on the energy and environmental areas that could be helped, it also touches on almost all aspects of the overall innovation agenda such as funding basic research and increasing STEM K-12 teachers. There is also a good quote from Sen. John Ensign (R-NV) who said, "I'm a fiscal conservative, but the dollars we invest in basic research will come back to us in spades in terms of stimulating economic activity and helping the United States remain at the forefront of global innovation."
Time Magazine has a pretty decent piece on NSF's Global Environment for Networking Innovations program, which has the goal of "[enabling] the research community to invent and demonstrate a global communications network and related services that will be qualitatively better than today's Internet."
We've covered the progress of GENI previously in this space, including the most recent announcement by the Computing Community Consortium (CCC) of the naming of the initial members of the GENI science council. As it stands now, GENI is a "Horizon" project in NSF's 2007 Facilities Plan -- a step away from "Readiness Stage," which would allow for extensive pre-construction planning. There are currently 10 projects listed in the plan as "Horizon" projects, and just one in the "Readiness Stage" for FY 2008 (the Advanced Technology Solar Telescope). For FY 2008, NSF has requested $20 million to ramp up GENI pre-construction planning -- so the program is moving forward, but still has some distance to go before it's ready to be included in the queue of projects being considered for the "Major Research Equipment and Facilities Construction" account in future budget years.

From the Time piece:

Although it has already taken nearly four decades to get this far in building the Internet, some university researchers with the federal government's blessing want to scrap all that and start over. The idea may seem unthinkable, even absurd, but many believe a "clean slate" approach is the only way to truly address security, mobility and other challenges that have cropped up since UCLA professor Leonard Kleinrock helped supervise the first exchange of meaningless test data between two machines on Sept. 2, 1969.
The Internet "works well in many situations but was designed for completely different assumptions," said Dipankar Raychaudhuri, a Rutgers University professor overseeing three clean-slate projects. "It's sort of a miracle that it continues to work well today."
No longer constrained by slow connections and computer processors and high costs for storage, researchers say the time has come to rethink the Internet's underlying architecture, a move that could mean replacing networking equipment and rewriting software on computers to better channel future traffic over the existing pipes.
Even Vinton Cerf, one of the Internet's founding fathers as co-developer of the key communications techniques, said the exercise was "generally healthy" because the current technology "does not satisfy all needs."
The New York Times has a nicely-written obituary for computing pioneer Ken Kennedy, penned by John Markoff. Kennedy also played an important role on the first incarnation of the President's Information Technology Advisory Committee (PITAC), which put together the 1999 Information Technology Research: Investing in our Future report. The strong, well-supported recommendations in that report helped pave the way for a dramatic expansion of the federal government's support for computer science research. Kennedy was also a co-PI on CRA's ultimately successful Computing Community Consortium proposal. Here's a snippet from the obituary:

A member of the generation of researchers who were the first to have access to modern supercomputers, Mr. Kennedy spearheaded early work on software programs known as parallelizing compilers, systems that can automatically spread workloads among a large number of processors, vastly speeding calculations.

Early computers were based on a single processor that would perform the steps of a software program sequentially. But in the 1970s and 1980s researchers began to look for ways to increase computing speed by harnessing tens, hundreds and even thousands of processors, in much the fashion that adding lanes to a freeway will allow more traffic to flow.
The challenge that such systems presented was the need to create programming tools that would hide the interdependencies and complexity from the scientists and engineers who wanted to use the machines as problem-solving tools.
"These compilers made it possible for mere mortals to write advanced programs," said Edward Lazowska, the Bill and Melinda Gates professor of computer science at the University of Washington in Seattle. "Ken was the No. 1 person in parallel compiling." (Parallel compilers are software programs that translate programmers’ language-oriented instructions into numeric codes that control computer operation.)
The software technology he developed has served as the foundation for successive generations of scientists and engineers who developed advanced simulations, including weather and climate prediction and the model of automobile collisions. Moreover, the fruits of his technology are now rapidly reaching broad consumer audiences both through the latest generations of personal computers and through videogame players, which now come equipped with parallel processors.
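For readers less familiar with the idea the obituary describes, here's a loose sketch (in Python, with invented function names -- not Kennedy's actual compiler technology) of the loop-level parallelism that parallelizing compilers extract automatically: when a loop's iterations are independent, they can be spread across processors without changing the result, which is exactly the transformation such compilers performed so that "mere mortals" didn't have to do it by hand.

```python
# Toy illustration of loop-level parallelism. A parallelizing compiler
# would detect that the iterations below are independent and perform
# this serial-to-parallel transformation automatically; here we do it
# explicitly with a process pool.
from concurrent.futures import ProcessPoolExecutor

def simulate_cell(x):
    """Stand-in for an expensive, independent per-element computation."""
    return x * x + 1

def run_serial(data):
    # What the programmer writes: a plain sequential loop.
    return [simulate_cell(x) for x in data]

def run_parallel(data, workers=4):
    # The parallelized equivalent: the same independent iterations
    # spread across several worker processes, results kept in order.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(simulate_cell, data))

if __name__ == "__main__":
    data = list(range(10))
    # Same answer either way -- the transformation preserves semantics.
    assert run_serial(data) == run_parallel(data)
```

The hard part, as the obituary notes, wasn't running work in parallel but proving iterations really are independent and hiding that analysis from the scientist writing the loop.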
I was privileged to have a few interactions with Kennedy over the six years or so I've been at CRA and was always impressed with his grasp of policy and his willingness to do more than was necessary in service of the field.
Update: Chuck Koelbel from Rice passed along these additional details:
A memorial service for Dr. Kennedy will take place at First Presbyterian Church, 5300 Main Street, Houston, on Thursday, February 15 at 3pm. In lieu of flowers, the family suggests gifts be made to Rice University, Ken Kennedy Memorial Fund. Checks may be mailed to Rice University MS-81, P.O. Box 1892, Houston, TX, 77251-1892. To contribute online, visit giving.rice.edu, click "Make a Gift Online," choose "Designation-Other," and type "Ken Kennedy Memorial Fund" in the Special Instruction box.
Today's Washington Times features an OpEd from two champions of science from opposite sides of the aisle: Former House Speaker Newt Gingrich and the new Chair of the House Committee on Science and Technology, Rep. Bart Gordon (D-TN). The piece was motivated by the recent Task Force on the Future of American Innovation report, (covered previously) which calls for a strong federal investment in fundamental research in order to help preserve the Nation's economic leadership and ensure our continued security. Gingrich participated in the roll-out event for the Benchmarks report and was quite eloquent on the national security implications of basic research, themes he and Gordon return to in this OpEd:
Throughout history, national security has been dependent on economic prosperity, and vice versa. An economically strong America is better able to defend itself. Likewise, the nation's ability to defend itself is a prerequisite to maintaining the infrastructure and other elements of a strong national economy.

Unfortunately, the nation has forgotten one of the most important ways our economic prosperity and national security are linked — investment in fundamental scientific research. Investments made in fundamental scientific research after World War II and during the Cold War have been essential to making our fighting men and women today the best equipped in the world. These previous investments and the new knowledge they generated also made enormous contributions to our economic vitality.

But our commitment to that defense-oriented fundamental research — the kind of research that pays off not in a year or two but in the long run, sometimes decades in the future — has eroded. If we do not renew this commitment, it will harm our global economic competitiveness as well as the effectiveness and safety of our troops.

Read the whole piece.

The piece is very well-timed, given the current deliberations on the stalled FY 2007 appropriations process and the President's forthcoming State of the Union Address. Its bipartisan authorship highlights the bipartisan support for fundamental research in Congress. With a flood of new Members of Congress in Washington, and "old" Members with new positions of responsibility, this is a drum that will need continuous beating in the coming months as we try to make up for the painful stumbles that came late in a year of fantastic progress.
The Sunday New York Times featured an article on the impact of the continuing resolution on science research. The article starts:
The failure of Congress to pass new budgets for the current fiscal year has produced a crisis in science financing that threatens to close major facilities, delay new projects and leave thousands of government scientists out of work, federal and private officials say.
It touches on a number of agencies, programs, and labs that are hurting and facing possible discontinuation. Regarding NSF it states:
The National Science Foundation, which supports basic research at universities, had expected a $400 million increase over the $5.7 billion budget it received in 2006. Now, the freeze is prompting program cuts, delays and slowdowns.

"It's rather devastating," said Jeff Nesbit, the foundation's head of legislative and public affairs. "While $400 million in the grand scheme of things might seem like decimal dust, it's hugely important for universities that rely on N.S.F. funding."
The threatened programs include a $50 million plan to build a supercomputer that universities would use to push back frontiers in science and engineering; a $310 million observatory meant to study the ocean environment from the seabed to the surface; a $62 million contribution to a global program of polar research involving 10 other nations; and a $98 million ship to explore the Arctic, including the thinning of its sheath of floating sea ice.
A number of quotes are included but one that sums up the thoughts of most of the community is from Mike Lubell at the American Physical Society, a fellow member of the Task Force on the Future of American Innovation.
"The consequences for American science will be disastrous. The message to young scientists and industry leaders, alike, will be, ‘Look outside the U.S. if you want to succeed.’ "
According to an article in Congressional Quarterly (sub. req'd), the new chair of the Senate Finance Committee, Max Baucus (D-MT), introduced legislation on the first day of the 110th Congress to permanently extend the R&D tax credit.
The R&D tax credit has always been a priority of the high-tech community on Capitol Hill and there have been previous attempts to make it permanent rather than extending it each year. However, the cost of such a permanent tax credit has generally kept this from happening. After much wrangling (most not related to the merits of the R&D tax credit itself, but rather the other tax proposals it was packaged with), the 109th Congress passed the R&D tax credit for 2006 retroactively in the waning days of the session in December and included the extension through 2007.
The CQ article also states that competitiveness issues will be a priority for Baucus as the chair of the Finance Committee. We certainly hope that is true, not just for the Finance Committee, but for all of Congress.
As mentioned previously in this space, the Task Force on the Future of American Innovation held a press conference for the release of the Benchmarks II report on Thursday. Former Speaker of the House Newt Gingrich, David Abshire, President of the Center for the Study of the Presidency, and Larry Wortzel, Chairman of the U.S.-China Economic and Security Review Commission and Vice President for Foreign Policy at the Heritage Foundation addressed a full house of Congressional staff, reporters, and other interested members of the DC crowd. This year’s Benchmarks report, called “Measuring the Moment: Innovation, National Security, and Economic Competitiveness,” focused more on defense and homeland security related research than the previous report.
National Journal’s Technology Daily and GovExec.com both ran an article on the event and report. A bit from the article:
A group of high-tech leaders and national security experts is asking President Bush to include basic defense research in his American competitiveness initiative.

The Task Force on the Future of American Innovation backed the request Thursday with a new report that warns that while funding for military research and development is at a record high, recent increases have focused on applying existing ideas to new weapons and equipment.
"We have been under-investing in the basic research needed for the next-generation military technology," the report warned. The task force was formed in 2004 to advocate for more federal support for research in the physical sciences and engineering…
Former House Speaker Newt Gingrich said the long-term goal should be not just combating terrorism but leading in science by investing in national security advances. "Otherwise we'll have opponents that have scientific capabilities we don't understand," Gingrich said.

He added that his biggest mistake as House speaker in the mid-1990s was not also tripling the National Science Foundation budget when Republicans doubled the National Institutes of Health budget.
We’ll keep you updated on the Task Force’s activities, press coverage of the report, and any impact it might have moving forward as we work with the Congress through the end of the year and into the next budget cycle.
A PDF of the Benchmarks II report can be found here.
NSF Director Arden Bement encouraged colleges and universities to expand high speed networking tools as a path to innovation in a speech to The Chronicle of Higher Education’s Technology Forum yesterday. The Chronicle article on the speech is available for free here for the next five days and then to subscribers only here.
A couple highlights from the speech and article:
"Leadership in cyberinfrastructure may well become the major determinant in measuring pre-eminence in higher education among nations," he said. "Indeed, to be even more provocative, I would suggest that leadership in cyberinfrastructure may determine America's continued ability to innovate -- and thus our ability to compete successfully in the global arena...."
Mr. Bement said cyberinfrastructure was a "comprehensive phenomenon that involves creation, dissemination, preservation, and application of knowledge." He said it was not just about building new networking tools, but new "norms of practice and rules, incentives, and constraints that shape individual and collective action."
In the previous entry, I mentioned that the Task Force on the Future of American Innovation (of which CRA is a member) was planning an event on November 16th to release its "Benchmarks II" report and press Congress to finish its good work on funding the President's American Competitiveness Initiative. Well, we can now share some details about it. It should be a good event:
** MEDIA ADVISORY **

Watch this space for all the details....

WHO:
- Newt Gingrich, former Speaker of the House
- David Abshire, President, Center for the Study of the Presidency, former Special Counsel to President Reagan and former Ambassador to NATO
- Larry Wortzel, Chairman, U.S.-China Economic and Security Review Commission and Vice President for Foreign Policy, Heritage Foundation
- Rep. Jim Cooper (D-TN), House Armed Services Committee.
WHAT:
1. Participants will challenge the Administration and Congress to provide greater Defense Department funding of basic research.
2. Participants will support full funding of President Bush's American Competitiveness Initiative.
3. Release of the 2006 Benchmarks Report of the Task Force on the Future of American Innovation. To read the 2005 report, go to http://futureofinnovation.org/PDF/Benchmarks.pdf (pdf).

WHERE:
Reserve Officers Association
One Constitution Avenue, NE
5th Floor Conference Room
WHEN: Thursday, November 16, 11:00 AM to Noon

- # # # -
The Task Force on the Future of American Innovation (www.futureofinnovation.org), comprised of organizations from industry and academia, advocates increased federal support for research in the physical sciences and engineering.
Formed in 2004, the Task Force urges strong, sustained increases for research budgets at the National Science Foundation, Department of Energy Office of Science, National Institute of Standards and Technology, and Department of Defense.
For more information, to RSVP, or to request an embargoed copy of the report, please contact:
Anne Caliguiri
202.682.4443
anne_caliguiri@aeanet.org

Barry Toiv
202.408.7500
barry_toiv@aau.edu
Steve Lohr has a great piece today in the NY Times on the state of CS, called "Computing, 2016: What Won't be Possible?" The essay was apparently spurred by last week's CSTB's 20th Anniversary symposium, which I regret that I couldn't attend. (Fortunately Cameron and David from ACM's U.S. Public Policy Committee did and have some great write-ups.)
Here's a snippet from the NY Times piece:
Computer science is not only a comparatively young field, but also one that has had to prove it is really science. Skeptics in academia would often say that after Alan Turing described the concept of the “universal machine” in the late 1930’s — the idea that a computer in theory could be made to do the work of any kind of calculating machine, including the human brain — all that remained to be done was mere engineering.

The more generous perspective today is that decades of stunningly rapid advances in processing speed, storage and networking, along with the development of increasingly clever software, have brought computing into science, business and culture in ways that were barely imagined years ago. The quantitative changes delivered through smart engineering opened the door to qualitative changes.

Computing changes what can be seen, simulated and done. So in science, computing makes it possible to simulate climate change and unravel the human genome. In business, low-cost computing, the Internet and digital communications are transforming the global economy. In culture, the artifacts of computing include the iPod, YouTube and computer-animated movies.

What’s next? That was the subject of a symposium in Washington this month held by the Computer Science and Telecommunications Board, which is part of the National Academies and the nation’s leading advisory board on science and technology.

You can read all of Lohr's piece today here. Glad to see that the CSTB event succeeded in getting the message across that computing is a discipline still rich with challenges and contributions to make. Let's hope this piece gets as wide a circulation (and has as big an impact) as this previous NY Times piece....
A few interesting pieces/tidbits to juxtapose this morning. Sam Liles helpfully forwarded this piece from The Tennessean on the declining interest in computer science as a major, which is apparently getting a fair bit of play on digg.com. The article asks the now familiar question:
Computer science majors make some of the highest starting salaries for college graduates in the country, at about $50,000 a year. Computer science and computer engineering jobs are some of the fastest-growing occupations in the nation, according to the U.S. Department of Labor. So why are university computer science departments watching their enrollments slide?

The article puts the finger on students' perceptions about the state of the job market -- that potential majors shy away from CS because of fears about offshore outsourcing. But it also does an "ok" job of showing how that might be a mistaken impression:
The East South Central region, which includes Tennessee, Kentucky, Alabama and Mississippi, is the fastest growing in the country in terms of information technology jobs, in part because of economic growth here, according to her agency's latest survey. Some 23 percent of chief information officers in that region plan to hire more workers this year and only 1 percent plan cutbacks.

Movva said she hasn't been able to find experienced consultants in Nashville, and has had to hire outside the region, including signing visas for foreign nationals, to fill job openings.

"There are lots of jobs but not enough people are entering this field,'' said Sandeep Walia, who is opening an e-commerce software office called Ignify on West End Avenue.

With Oracle database experts making as much as $150,000 a year, "you wonder why more people aren't getting into this,'' Walia said.

Vanderbilt professors are worried about the perception that jobs aren't out there.

The department's Web site includes a plea from the chairman to prospective students that says: "Contrary to what you may be reading in some publications, there are jobs. …

"The jobs are out there, but the perception is that they're not,'' said Richard Detmer, the chairman of the computer science department at Middle Tennessee State University.

Jonathan Waite graduated with a bachelor's degree from Vanderbilt in May. But he says the job market is saturated with computer scientists. He feels that way even though he got three job offers in three months of looking for a job.

But students' perceptions of the job market aren't the only aspect of the problem worth addressing. Increasingly, CS departments are realizing that the way they teach computer science might have something to do with declining interest in their major, too. That's the focus of this piece in today's Inside Higher Ed, "New 'Threads' for Computer Science." The piece (which must be good because it quotes my boss, Andy Bernat, and CRA Board Member Rich DeMillo) focuses on the announcement of planned curriculum changes in the College of Computing at Georgia Tech, where DeMillo is Dean.

The Georgia Institute of Technology is today unveiling what some experts believe is a much broader approach to the problem. The institute has abolished the core curriculum for computer science undergraduates — a series of courses in hardware and software design, electrical engineering and mathematics. These courses, in various forms, have been the backbone of the computer science curriculum not just at Georgia Tech but at most institutions. In their place, Georgia Tech is introducing a curriculum called Threads.

...

Underlying this approach is the view that "the one size fits all approach to computer science just isn’t working anymore," said Richard A. DeMillo, dean of the College of Computing at Georgia Tech. The plans were developed by professors, who prepared a white paper outlining how this approach would create "symphonic thinking" graduates — another way of saying graduates whose jobs wouldn’t be outsourced, a fear keeping many out of the field.

"The really big change here is that we were willing to give up the idea of a core curriculum," said DeMillo. "If you have 90 percent of your courses occupied with the core, you don’t have the flexibility to do anything creative."

The Georgia Tech approach is noteworthy, not just because it's an interesting approach to the problem, but because -- as Andy points out in the article -- it's being undertaken by one of the bigger schools in computing. There's plenty of additional detail on Georgia Tech's approach in the article and on the Georgia Tech website.
Additional efforts to improve the quality of CS education will likely be given a boost by NSF's very recent solicitation for its new CISE "Pathways to Revitalized Undergraduate Computing Education" (CPATH) program. The new program will make $6 million in awards in FY 2007 to encourage "colleges and universities to work together, and with other stakeholders in undergraduate computing education including industry, professional societies and other types of organizations, to formulate and implement plans to revitalize undergraduate computing education in the United States."
While the image of computing still requires a lot of work, it's also becoming increasingly clear that the field needs to reexamine the way it educates its undergraduates. In the coming months, I think we'll see further efforts by the various computing societies (including CRA) to put a focus on CS education. Hopefully the NSF solicitation will uncover some interesting ideas and approaches within the discipline as well.
The Council on Competitiveness will unveil a "new study regarding public-private partnerships that leverage supercomputing resources funded by the federal government for greater industry strength" on September 7 during the Third High Performance Computing Users Conference. The announcement can be found on HPC Wire and we'll post more details once the study is released.
Last June, CRA joined with over 300 other science and university groups in filing comments (pdf) opposing the Department of Commerce's proposed change to so-called "deemed export" regulations that would seriously impact university research efforts. A deemed export occurs when a foreign national "uses" technology subject to export restrictions while in the United States. The proposal would have made a number of significant changes:
The Department of Commerce has apparently listened to the community's opposition and decided to step back from its proposed rule. A Bloomberg story with some of the detail is here. This is my favorite quote:
"I came to the conclusion it was a much sounder approach to actually think about the overarching policy and revisit basic assumptions and revisit objectives," said [David] McCormick, [U.S. undersecretary of Commerce for industry and security].

Read the whole thing.
Here's CRA's original coverage of the proposed rule and our filed comments (pdf).
A nice win for the science community....
Sen. John Sununu (R-NH), known as one of the biggest budget hawks on the Hill (in fact, he's the highest ranked "taxpayers' friend" in the Senate, according to the National Taxpayers Union) has his take on the current push for competitiveness legislation in today's Washington Times. While it's not surprising that he sees lots of "waste" when he looks at the competitiveness bills currently floating around the Senate, it's encouraging that the essence of his Op-Ed is that the federal government's real role in advancing competitiveness is in supporting fundamental research. Here's a liberally-quoted bit from the piece (no pun intended):
As this debate moves forward, any legislation designed to promote American competitiveness and innovation should adhere to the following rules to ensure that American taxpayer dollars are not wasted or misused:

Focus on the basics. Federal funding for research and development should be applied toward basic science and technology (such as chemistry, physics, material science and computational mathematics) rather than applied research, technology transfer or commercialization efforts. The private sector — not the federal government — has the obligation to advance the findings of basic research into marketable products and technologies. Equally troubling, legislators await the movement of a competitiveness bill in hopes they may attach pet research projects or fund a favored industry. Politicizing the process only undermines the integrity of peer review and dilutes the effectiveness of these resources.

Don't over-promise. To date, Senate competitiveness bills are littered with increased authorization levels for various purposes. Billions of dollars would be needed to actually fund programs at such inflated levels. Given this scenario, reasonable authorization levels must be utilized to ensure that funding can actually be secured through the appropriations process. It would not be beneficial to repeat an example from 2002, when Congress reauthorized the NSF with the goal of doubling its annual funding. Ultimately, NSF appropriations never approached such levels.

Limit new programs. Like so many other sound-bite driven "debates" in Congress, competitiveness proposals often boil down to the usual simplistic solution: Create more government programs. How many times do we have to go down this same costly road? And when was the last time we dealt effectively with a complex problem by creating new federal programs? One Senate bill would create more than 20 new programs without eliminating a single one. Dozens already exist, including the Advanced Technology Program, the Manufacturing Extension Partnership, and other questionable expenditures of funds. Congress should not create new programs without a thorough review of the value and efficacy of existing programs. Otherwise, we are merely diverting funding to new programs and layers of bureaucracy when such money could be used on basic research.

Make hard decisions. Once realistic authorization levels are established, Congress needs to make the necessary adjustments to ensure funding increases actually occur. Spending billions on a competitiveness agenda through deficit spending restricts future economic growth, and stunts future innovation and competitiveness. If we are to increase funding for a competitiveness agenda, legislation needs to include necessary rescissions and program repeals to remain budget neutral.

Don't play favorites. Given the popularity of a competitiveness initiative, it is disappointing that agencies integrally involved in basic research are being ignored. For instance, NASA's basic science mission, referred to by many as its crown jewel, results in significant scientific findings. Ironically, the administration recently proposed that planned spending for these accounts be cut by more than $3 billion over the next few years, a decision NASA Administrator Michael Griffin admitted was made solely for budgetary reasons. How is this internally consistent for the administration?

If done for the right reasons, a successful plan to invest new resources in scientific research can have a positive impact. Without discipline and focus, however, Congress is doomed to repeat the same mistakes, fund more failed programs and expand federal bureaucracy.

America's technology-driven economy grows despite, not because of, government intervention. That is a lesson we all need to learn before trying to "fix" what ails us.

While we could quibble with a lot of that -- the difference between "basic" and "applied" research is often not so cut and dried as he implies, authorizing NSF's doubling sent an important signal, etc. -- it's hard to imagine getting a more favorable endorsement from a fiscal conservative of the portions of the ACI we care most about. It's certainly a more thoughtful response to the President's plan than a recent conservative think tank take, which ignored the R&D portion of the ACI completely....
Anyway, even if you disagree with the perspective, Sununu's OpEd is worth reading.
Another quick pointer to some articles of potential interest to readers. FCW has three noteworthy pieces today.
Michael Hardy has a good article on whether government and industry remain committed enough to research to keep the U.S. competitive in the years to come.
At one time, the United States was the undisputed center of innovative technology development in the world — and the U.S. government led the charge. Because of government research, sleek cylinders carried men to the moon, and later, sleek cables carried data worldwide, a breakthrough that would come to be known as the Internet.

Times have changed. Other countries are emerging as technology centers, and the U.S. government has stepped back from its leadership position, letting the private sector try to fill the gap. Technology has made the world flat, in the words of author Thomas Friedman, so that oceans and borders are no longer boundaries to the flow of expertise and inspiration.
This evolution has many ramifications. Some fear that the United States is losing its stature as a world leader in innovation. Others point to the profit motives of industry, saying that research without a probable commercial application is less likely to get done if government doesn’t do it.
And Aliya Sternstein has two pieces of interest. One is a good survey of some of the legislation currently in circulation surrounding the competitiveness and innovation issue. The other details the NSTC Cyber Security plan we covered recently. Sternstein has a good quote from former CRA board chair (and current Government Affairs committee co-Chair) Ed Lazowska:
Ed Lazowska, co-chairman of the President’s IT Advisory Committee from 2003 until its authorization expired in June 2005, said the government must increase funding to reach the goals listed in the report.

"So my entreaty to Dr. Marburger is, 'Spare me the commendations and show me the money,'" Lazowska said. "It's time for leadership and investment."
Links to all the stories:
Just a quick pointer to an interesting Computerworld article featuring comments from Kenneth Berman, Randy Bryant, John Canny, Jaime Carbonell, Bernard Chazelle, and William Dally on the current state of computer science. Here's a snippet:
How can CS be made a more attractive choice for students?

Bryant: We should stop scaring them away. Predicting that all IT jobs will move offshore could become self-fulfilling. New jobs are growing faster than old jobs are moving offshore, and that trend will continue. We need to stop putting them to sleep. Students who take computer science classes in high school are taught how to write programs in Java, and their assignments have them writing code that does tedious things like sort lists of numbers. They do not learn about any of the big ideas of computer science.
Chazelle: I roll my eyes when I hear students say, "CS is boring, so I'll go into finance." Do they know how dull it is to spend all-nighters running the numbers for a merger-and-acquisition deal? No.
People have run away from CS because they are worried about outsourcing. This is a valid concern that can't be waved away by simply repeating the mantra that CS is cool.
Dally: We need to clear up many misconceptions about the field. Prospective students should understand that there are plenty of CS jobs in the U.S. and they pay well, that most CS jobs involve working with teams of people and place a premium on communication skills and teamwork -- it's not just a bunch of nerds working individually at terminals -- and that CS is so central to so many aspects of our economy that a CS education is good preparation for many careers.
Canny: We're losing in quality -- principally to bioengineering, which is now the best students' top choice -- and diversity. It's a problem of social relevance. Minorities and women moved fastest into areas such as law and medicine that have obvious and compelling social impact. We've never cared much about social impact in CS.
Read the whole thing.
The Congressional Budget Resolution -- the first real step in the annual appropriations process -- has not gotten off to the smoothest of starts. The budget resolution is Congress' response to the President's budget request and, if passed, would set the total level of discretionary spending the appropriators would have to hand out over the course of passing their annual appropriations bills. Beyond that top-level number, the rest of the resolution isn't incredibly significant. The budget resolution is divided into a number of "budget functions" that describe general areas of federal discretionary spending. "Function 250," for example, is the "General Science, Space and Technology" account, from which NASA, NSF, DOE Office of Science and DHS S&T would ostensibly receive their money. In truth, however, the budget functions described in the Congressional budget resolution only loosely correlate to the final agency appropriations levels.
[Here's the wonky digression....] If the House and Senate agree on a budget resolution, that top-level discretionary number becomes binding. It's what's called the 302(a) allocation, and it would represent the total amount of discretionary funding the government has available to spend this year. From that number, the House and Senate leadership and the respective Appropriations committees have to decide how that money gets parceled out to the appropriations subcommittees, each responsible for a single appropriations bill this year. Confusingly, the subcommittee jurisdictions don't line up neatly with the budget functions laid out in the resolution. Three different subcommittees, for example, are responsible for agencies that receive funding from the Function 250 account mentioned above: the Science, State, Justice, Commerce (or, more confusingly, just the Commerce, Science, Justice committee in the Senate) subcommittee; the Energy and Water subcommittee; and the Homeland Security subcommittee. Since the budget resolution doesn't specify funding levels for particular agencies, the appropriators and leadership sort of, well, wing it when it comes to parceling out the 302(b)s. Ok, it's not quite winging it, but they do only use the budget resolution to "advise" the process, not direct it explicitly.
So, why does this all matter then? And what's going on with the budget resolution this year?
The budget resolution is a prime indicator of the political climate for various funding issues. It's the first clear opportunity we get to assess the mood of the two parties -- and maybe more importantly, the various factions within the parties -- towards funding the programs we care most about. This year's budget resolution so far tells us that funding for science has strong support in the Senate, strong support from the House Democrats, and not much obvious support from the House GOP leadership. This isn't terribly surprising given recent events, but it's also not terribly encouraging as we move forward with the appropriations process.
Here's where we stand:
The Senate passed its version of the FY 07 budget resolution in mid-March. Included in the Senate resolution is enough funding for the President's American Competitiveness Initiative, plus some additional spending -- $16 billion over the President's proposed discretionary cap of $873 billion, in part to make up for cuts in Health and Education proposed in the President's budget.
On March 29, the House Budget Committee passed a more parsimonious version of the resolution, sticking to the President's cap, but not guaranteeing budget space for the President's ACI. In the House version, the account that would include funding for the ACI-targeted agencies (NSF, NIST and DOE Office of Science) along with funding for NASA -- the "Function 250" account, for which the President requested $26.3 billion -- would receive $300 million less than the President's request. (In contrast, the Senate included $100 million more than the President's request for Function 250 in their budget resolution.)
The House leadership was hoping to vote on their resolution two weeks ago, before the Congressional "Spring/Easter Break." However, that process faltered when two factions of the GOP -- the moderates and the appropriators -- rebelled and threatened to vote against the measure. The moderates don't believe the measure provides enough discretionary spending for their priorities (which, for some, include fully funding the ACI), while the appropriators are concerned about language that would force them to get approval from the budget committee before considering any "emergency supplemental" spending bills, which have proven to be attractive vehicles for pork. So the leadership pulled the resolution without allowing a vote and decided to take advantage of the two-week spring Congressional recess to try to make some deals. The leadership plans to continue working this week to strike a deal with enough GOP members to put the resolution to a vote again next week.
Failing to get a deal done could have serious consequences. In the House, it's actually not too big a problem. In the absence of a deal, the House leadership can "deem" a budget with an $873 billion discretionary cap. It opens them up to charges of being a "do-nothing" Congress from the Democrats and isn't a great showing by Majority Leader John Boehner (R-OH) in his first budget negotiation, but for all practical purposes, the House leadership would probably be fine with the $873 billion figure.
The Senate doesn't have the ability to "deem" a final number, however, so failing to reach an agreement would mean that the Senate would be forced to use the FY 07 budget number contained in the FY 06 Budget Resolution passed last year -- which would set the discretionary number at $866 billion, $7 billion below the President's request and $23 billion below the number the Senate passed last month. Finding $23 billion to cut in the President's budget won't be easy, and unfortunately, one juicy target would be the increases proposed as part of the ACI.
So, the science community is hoping that a deal can be struck to get the House and Senate numbers a little closer together. The computing community is part of the effort to urge the House leadership to include funding for ACI in the budget resolution, citing the ACI's importance to computing research and computing research's significant contribution to current and future American competitiveness. The leadership and supporters of the computing research community have taken advantage of this opportunity to put the case to the House Leadership, at a time when they can take a relatively easy step to address it (all told, the increase for R&D in the ACI is less than $1 billion). Here's the letter (pdf) that resulted (and was delivered on Friday):
Thanks especially to Cisco, Intel and Microsoft, who put some of their political capital on the line to sign on to this important message. Their presence is very good news for our efforts and lends considerable weight to this letter.

SUPPORTING COMPUTING RESEARCH AND INNOVATION

April 21, 2006
The Honorable Dennis Hastert
Speaker
U.S. House of Representatives
Washington, D.C. 20515

Dear Speaker Hastert,
As leaders and supporters of the computing research community, we write to express our concern that the proposed House Budget Resolution does not assume full funding for President Bush’s American Competitiveness Initiative. We respectfully request that Congress embrace this initiative by fully funding the President’s request in the budget resolution.
Numerous high-profile reports have pointed out the significant challenges that America faces from fierce and growing global competition. The President’s plan recognizes the critical linkage between the federal investment in fundamental research and the rise in innovation that will be required to respond to these challenges. The President’s call for increasing investment in basic research in the physical sciences represents a historic opportunity to secure the Nation’s leadership in research in information technology and other physical sciences and help ensure America’s future competitiveness.
The computing research field is a very concrete example of how federal investments in fundamental research drive economic growth. The field has a long history of creating revolutionary technologies that have enabled entirely new industries and driven productivity growth so critical to U.S. leadership in the new economy. A 2002 National Academies report found that federal support for computing research helped create 19 multibillion-dollar industries and made America the global leader in information technology. Further, several noted economists, including Alan Greenspan, have cited the key role that information technology continues to play in driving U.S. productivity. Flat or declining agency budgets supporting computing research have created a significant concern within our community that we will cede these gains and our leadership by putting future innovation at risk.
The President’s American Competitiveness Initiative provides more funding for the National Science Foundation, the Department of Energy’s Office of Science, and the core labs program at the National Institute of Standards and Technology. Each agency plays an important role in funding computing research. While the House Budget Resolution does increase funding for sciences broadly, it is not clear that the increase will be enough to fund the President’s initiative. We specifically ask that the budget resolution allocate enough funding to ensure the President’s proposal can be met during the appropriations process.
Thank you for considering our request. We look forward to working with you as the Budget Resolution and appropriations for these agencies move through Congress.
Sincerely,
The American Association for Artificial Intelligence (AAAI)
The Association for Computing Machinery, U.S. Public Policy Committee (USACM)
Cisco Systems, Inc.
The Coalition for Academic Scientific Computation (CASC)
The Computing Research Association (CRA)
The Electrical and Computer Engineering Department Heads Association (ECEDHA)
Intel Corporation
Microsoft Corporation
The Society for Industrial and Applied Mathematics (SIAM)
Also good news is the fact that the President continues to tour the country making the case for the ACI. Last week the President stumped on the issue at a high school in Maryland, at Tuskegee Institute, and at Cisco in Silicon Valley. Tom Abate of the San Francisco Chronicle has coverage of the President's visit to Cisco. The visit spawned this very supportive editorial in the San Jose Mercury News. Here's a snippet:
As the president himself pointed out at Tuskegee University on Wednesday, it was through federally funded research that "the Internet came to be." Other fruits of government-funded research include search technologies that spawned companies like Google, microprocessor breakthroughs that turned Apple, Sun Microsystems and Silicon Graphics into powerhouses, and countless technological advances that delivered enormous benefits to the economy. Future research in new energy technologies, for example, could help reduce America's dependence on foreign oil and turn the nation into a world leader in clean energy.

...And without the investment, America's eroding ability to compete globally is certain to deteriorate further. Nations such as China and India, Russia, Ireland and countless others are emerging as economic powers in part because they are willing to invest in themselves, in the education of their children and in the training of their workers.

...

The seeds of America's prosperity over the past few decades were planted in the late 1950s, when the launch of Sputnik by the Soviet Union prodded President Eisenhower to call for massive investments in education, infrastructure and research. The time to secure our children's prosperity is now.

The President's continued efforts and the support of industry (pdf) are crucially important in getting the ACI enacted and the funding levels called for in the initiative appropriated. As that last pdf points out, the amounts we're talking about here are not large -- indeed, in the context of the federal budget they are quite literally a rounding error -- and yet the potential payoff is dramatic. Hopefully the leadership will figure that out as they decide on their allocations.
Citing ACM's report on globalization, the New York Times today editorializes on the mismatch between the perception of the high-tech industry job market and the reality.
The Association for Computing Machinery, the professional organization that issued the report, says that there are more information technology jobs today than at the height of the dot-com boom. While 2 to 3 percent of American jobs in the field migrate to other nations each year, new jobs have thus far more than made up for the loss.

That picture, of course, stands in contrast with the more familiar gloomy depiction of runaway outsourcing. Perhaps that explains what the report says is declining interest in computer science among American college students. Students may think, Why bother if all the jobs are in India? But the computer sector is booming, while the number of students interested in going into the field is falling.

The industry isn't gone, but it will be if we don't start generating the necessary dynamic work force. The association says that higher-end technology jobs — like those in research — are beginning to go overseas and that policies to "attract, educate and retain the best I.T. talent are critical" to future success. Given the post-9/11 approach to immigration and the state of math and science education in America, that is hardly encouraging.

Information technology jobs won't go away unless we let them. Computing in the past five years has become, according to the report, "a truly global industry." In the next few years, jobs won't just land in our laps. We have nothing to fear but the fear of competing itself.

We covered the report from ACM's Job Migration Task Force recently in this space. Given the amount of work I know went into it, I'm pleased that the report appears to be having such a significant impact.
Update: (9:40 am, 3/1) - The Wall Street Journal has a related piece today titled Market is Hot for High-Skilled in Silicon Valley. I assume it's publicly available. If not, I'll excerpt it a bit. But the lead tells the story:
Five years after the dot-com bubble burst, job growth has returned to Silicon Valley. But it's a different kind of growth than in past recoveries, favoring higher-skilled workers.
While there has been some progress in straightening out the mess that is the visa process post-9/11, as this Washington Post story indicates, the situation is still pretty bad for those who have research interests in high-tech areas.
A decision two weeks ago by a U.S. consulate in India to refuse a visa to a prominent Indian scientist has triggered heated protests in that country and set off a major diplomatic flap on the eve of President Bush's first visit to India.

The incident has also caused embarrassment at the highest reaches of the American scientific establishment, which has worked to get the State Department to issue a visa to Goverdhan Mehta, who said the U.S. consulate in the south Indian city of Chennai told him that his expertise in chemistry was deemed a threat.

...

The consulate told Mehta "you have been denied a visa" and invited him to submit additional information, according to an official at the National Academy of Sciences who saw a copy of the document. Mehta said in a written account obtained by The Washington Post that he was humiliated, accused of "hiding things" and being dishonest, and told that his work is dangerous because of its potential applications in chemical warfare.

Mehta denied that his work has anything to do with weapons. He said that he would provide his passport if a visa were issued, but that he would do nothing further to obtain the document: "If they don't want to give me a visa, so be it."

The scientist told Indian newspapers that his dealing with the U.S. consulate was "the most degrading experience of my life." Mehta is president of the International Council for Science, a Paris-based organization comprising the national scientific academies of a number of countries. The council advocates that scientists should have free access to one another.

"Making the wrong decision would be career-ending, so they play it safe, not really understanding the macroscopic implications of their decision," Wulf said. "Denying a visa to the president of ICSU is probably as dumb as you can get. This is not the way we can make friends."

As Bill Wulf of the National Academy of Engineering points out in the article, these consular officials are under tremendous pressure not to make mistakes in deciding whom to allow into the country. Still, if the process leads to the summary denial of entry for someone like Mehta, the process clearly needs some work. You can read the whole thing here.
Ok, so that's about the most played-out cliche in politics, but it's hard to come up with another phrase that encapsulates how pervasive the competitiveness meme has become in science policy circles -- and more encouragingly, in the words of administration and congressional policymakers -- over the last year.
Also, apologies for going sort of radio silent here the last couple of weeks, but there's lots going on surrounding this issue and we're involved in some of it, which makes chatting about it a little dicey. But here's where things stand.
At the moment, all eyes (ears?) are focused on the President's State of the Union speech tomorrow night. Among the new programs and initiatives he's expected to announce may be one aimed at ensuring U.S. competitiveness, which could feature a number of important planks. Now, I have no specific knowledge of what is actually in the speech, but there's been a bit of press coverage, plenty of rumors floating around town, and a few tea leaves that can be read.
It seems fairly clear that there will be a focus on education, a focus on workforce/immigration, and a focus on "innovation" that could include increased budgets for federal science agencies. One big clue is the Administration's apparent fondness for the National Academies' "Rising Above the Gathering Storm" report put together by former Lockheed-Martin CEO Norman Augustine. There have been several mentions of the report by folks within the Administration. Maybe the most prominent mention was by White House Chief of Staff Andrew Card during his January 11th talk at the U.S. Chamber of Commerce. This exchange is about 47:28 into the webcast:
Question: There's a recent report from the National Academy of Sciences, put together by Norm Augustine, called "The Gathering Storm." It raises some questions about science and technology leadership in the U.S. going forward. Do you have any thoughts on that, especially as it relates to the economy, one of your key issues?

Card: I would encourage you to read this report, which is The Gathering Storm. It's about our need to have more engineers and scientists in the United States. It is work that was done in the private sector under the auspices of the National Academy of Sciences, and Norm Augustine did lead the effort. There were some great academics involved.

I actually read the report, not just the summary, but the report. And it is dramatic in its exposure to that which is a problem in the United States, and how few young people are going into the physical sciences, into math, and how they're not going to college with an expectation that they'll be an engineer, or a mathematician, or a physicist.

The life sciences have actually had a little bump up. There's some excitement about the life sciences, but on the physical sciences side, there is a dearth of students, and there is a dearth of teachers, and a dearth of scholarships and opportunities at some of our major institutions. This report highlights that. It outlines a road map toward solving the problem. It's a ten-year road map.

We are taking a very close look at it in the Administration. We are very forward leaning in believing it is the right issue to address. Many of the suggestions are appropriate suggestions, but we have to put them in the context of Josh Bolten's budget. And we'll be doing that.

It is a compelling report.

We've covered the Gathering Storm report in this space, and it's filled with things we like. If the Administration embraces the report in any meaningful way -- particularly its core recommendation to "sustain and strengthen the nation's traditional commitment to the long-term basic research that has the potential to be transformational to maintain the flow of new ideas that fuel the economy, provide security, and enhance the quality of life" -- then we'll be very pleased. After all, this represents a pretty significant (and welcome) sea change for the Administration, which until recently has maintained, as John Marburger, Director of the White House Office of Science and Technology Policy, said back in March, that "the U.S. is so far ahead in [science and technology] that we are going to be able to maintain our competitive strength. I don't see the same danger signs [that others do]."
There are other hints that the President may be willing to adopt an "innovation" agenda, including a number of tidbits in the press. Technology Daily reports today (sub. req'd.) that some high-tech officials who have met with White House senior officials in recent days have come away optimistic about the Administration's commitment to innovation. Yesterday's Boston Globe indicates Norm Augustine will play an important role in the President's speech. And the Baltimore Sun has two pieces on the likelihood of "innovation" being a featured part of Bush's remarks. The big question is whether there will be the funding commitment to accompany any rhetorical commitment to innovation by the President.
If the President chooses to truly embrace the recommendations of the Augustine report, his budget will find a way to provide for a significant increase for the National Science Foundation, and perhaps to the National Institute of Standards and Technology and the Department of Energy's Office of Science. The Augustine report specifically recommends an increase of 10 percent a year for the next seven years for "long-term basic research...with special attention paid to the physical sciences, engineering, mathematics, and information sciences." This is the approach taken in both the National Innovation Act introduced by Sens. John Ensign (R-NV) and Joseph Lieberman (D-CT) and the new "Protecting America's Competitive Edge" (PACE) Act, introduced last week by Sens. Pete Domenici (R-NM), Jeff Bingaman (D-NM), Lamar Alexander (R-TN) and Barbara Mikulski (D-MD).
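As a back-of-the-envelope check (this calculation is mine, not from the report or the bills): a 10 percent annual increase sustained for seven years, as the Augustine report recommends, compounds to roughly a doubling of the research budget.

```python
# Compound growth implied by the Augustine report's recommendation:
# a 10% annual increase over seven years.
growth_factor = 1.10 ** 7
print(round(growth_factor, 2))  # prints 1.95 -- i.e., just under a doubling
```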
The strong bipartisan support accorded both bills in the Senate is indicative of the traction the "competitiveness and innovation" case has in Congress. A year's worth of reports -- most familiar to readers of this blog -- by some of the most influential academic and industrial entities, all making the same essential points that the world has become an increasingly competitive place and that the US isn't currently doing enough to ensure our future scientific and innovative leadership, has clearly had an impact on Members of Congress -- and now, hopefully, the Administration.
But even if the President does include significant increases for basic research in his budget, there will be a lot of work remaining. As Congress is fond of pointing out, "the President proposes, Congress disposes." This is, after all, a time of incredibly tight budgets, with lots of pressure in place to hold down increases in discretionary spending. So, step one will be making sure that the Congressional Budget Resolution includes the same support for fundamental research that we hope will be present in the President's budget. This in turn will aid in getting "302(b) allocations" (essentially, the amounts each of the 10 or 12 (House v Senate) appropriations subcommittees are allowed to spend for the bills under their control) that are robust enough to let the subcommittees that oversee the science agencies provide any increases called for in the budget. Then it will be up to this same coalition of partners in university and industry to make the case to appropriators. In past years, the lack of room under the budget "caps" has prevented even some of the most ardent congressional champions of research from providing significant increases. A strong budget request and good 302(b) allocations would remove that constraint.
So, I'm cautiously optimistic and very eager to hear the President's words tomorrow night. If the Administration comes through with a proposal that embraces the best of the Augustine report recommendations, it is hugely important that they, and Congress, hear from the community in support of the idea. As we've noted in the past, the case for bolstering U.S. competitiveness by bolstering U.S. innovation finds strong support in both parties. Supporting the plan need not commit you to supporting any one party.
But let's see what's in the plan, first.
The President will deliver the State of the Union at 9 pm, Tuesday, January 31st.
We'll have more after the speech (or earlier, if we get some scoop...).
Ok, we're back from our extended holiday hiatus. We'll be catching up throughout the next day or so, but I thought I'd first post a quick link to this interesting Chronicle of Higher Education Colloquy. It's entitled "The Computer Science Clubhouse":
Only 17 percent of undergraduate computer-science degrees were awarded to women in 2004, according to the Computing Research Association, down from 19 percent in 2000. Why is the number so low, and dwindling?

Are women less attracted than men to programming, as an influential study from the late 1990s indicated? Should admissions policies and curricula be redesigned with women in mind? Or will that serve only to marginalize women?

More-recent research suggests that women avoid the field because they are discouraged as children from using or playing with technology, then discriminated against in computer-science classes and high-tech workplaces. What kinds of support systems, such as mentoring programs or alumnae networks, might solve those problems?

Claudia Morrell of the Center for Women and Information Technology at the University of Maryland Baltimore County will answer questions submitted by readers on Thursday, January 12, beginning at 1 pm. So get your comments and questions in now.
The Boston Globe has a great, fairly in-depth piece today on the declining interest of women in computer science. Reporter Marcella Bombardieri writes:
Born in contemporary times, free of the male-dominated legacy common to other sciences and engineering, computer science could have become a model for gender equality. In the early 1980s, it had one of the highest proportions of female undergraduates in science and engineering. And yet with remarkable speed, it has become one of the least gender-balanced fields in American society.

...

The percentage of women studying physics, already low, dropped dramatically and stayed in the single digits for decades. Eventually the physics bubble burst for men as well, and today a high percentage of the country's physicists are foreign-born.

Some computer scientists fear that they may be going in the same direction. They view the dearth of women as symptomatic of a larger failure in their field, which has recently become less attractive to promising young men, as well. Women are ''the canaries in the mine," said Harvard computer science professor Barbara J. Grosz.

In the wake of the dot-com bust, the number of new computer science majors in 2004 was 40 percent lower than in 2000, according to the Computing Research Association. The field has seen ups and downs before, and some think the numbers for men will soon improve at least a bit. But the percentage of undergraduate majors who are female has barely budged in a dozen years.

The shortage of new computer scientists threatens American leadership in technological innovation just as countries such as China and India are gearing up for the kind of competition the United States has never before faced.

Read the whole thing (it also mentions the National Center for Women and Information Technology).
Some good coverage in the press of an announcement today by Google, Microsoft and Sun that they'll help jointly fund (to the tune of $1.5 million a year for five years) Dave Patterson's new Reliable, Adaptive, and Distributed systems Lab (RAD Lab) at UC Berkeley.
Both the NY Times and San Jose Mercury News note the DARPA angle to the story -- namely, that as DARPA has pulled away from funding university-led research in computer science over the last several years in favor of shorter-term, typically classified efforts (a fact we've detailed pretty extensively on this blog), other agencies haven't stepped up to fill the gap, leaving university researchers to scramble for funding. This has put significant pressure on NSF, as formerly DARPA-funded researchers turn to the Foundation for support, and the agency is feeling the strain.
Here's how the NY Times covers it:
Mr. Patterson, currently the president of the Association for Computing Machinery, a national technical organization, has recently been a vocal critic of the shift of basic research funds away from universities and toward military contractors.

"We're trying to sustain the broad vision, high-risk and high-reward research model," Mr. Patterson said of the new Berkeley effort.

The Berkeley researchers began looking for industry support last year when they realized that the Pentagon Defense Advanced Research Projects Agency, known as Darpa, was withdrawing support for basic research at the university, he said.

In a memorandum submitted to a Congressional committee earlier this year, Darpa officials disclosed that its spending on basic computer science research at universities had declined by 5 percent between 2003 and 2004. Government officials and corporate research executives noted the indirect effects of the changes in federal research support over the last five years.

"When funding gets tight, both researchers and funders become increasingly risk-averse," said William Wulf, president of the National Academy of Engineering.

I'm not sure where the "declined by 5 percent between 2003 and 2004" figure comes from. DARPA told the Senate Armed Services Committee earlier this year that the drop was much more precipitous:
DARPA computer science funding (in millions):

                      FY01    FY02    FY03    FY04
Total Comp Science    $546    $571    $613    $583
University Funding    $214    $207    $173    $123
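To see how far off the "5 percent" figure is, here's a quick calculation from the DARPA figures above (the year labels FY01-FY04 are inferred from the Mercury News numbers quoted below, which cite $214 million in 2001 and $123 million in 2004):

```python
# DARPA university funding for computer science, in millions,
# from the figures DARPA gave the Senate Armed Services Committee.
university_funding = {2001: 214, 2002: 207, 2003: 173, 2004: 123}

# Decline from FY03 to FY04 -- far steeper than the 5 percent the Times cites.
drop_03_04 = (university_funding[2003] - university_funding[2004]) / university_funding[2003]
print(f"{drop_03_04:.0%}")  # prints 29%

# Decline over the full FY01-FY04 span.
drop_01_04 = (university_funding[2001] - university_funding[2004]) / university_funding[2001]
print(f"{drop_01_04:.0%}")  # prints 43%
```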
The Merc got it right:
The Pentagon's Defense Advanced Research Projects Agency has been one of the key financial backers of computer science research at universities. But DARPA's university funding dropped from $214 million in 2001 to $123 million in 2004, as the agency shifted its focus to classified research that favors military contractors.

The drop in funding comes as computer science research is expanding.
Anyway, in Patterson's case, his group was able to make the case to three of the industry's giants that support for university research in the RAD Lab's focus area is in their best interest and secured a significant commitment from each one. While this is fantastic news for Patterson and his colleagues at Berkeley (and sure to reap big benefits for the three industry partners, as well as the rest of the industry -- that's the nature of university-led research), this is unlikely to be a model that scales very well across the country.
"There are only two or three companies with pockets that deep,'' said Phil Bernstein, a senior researcher at Microsoft Research and treasurer of the Computing Research Association. "There just aren't that many big companies, and a lot of them don't do research. There aren't a lot of doors to knock on.''So well-deserved kudos to Google, Microsoft and Sun (all members of CRA, by the way) for recognizing the value of university-led research and stepping up at a time when federal funding is in flux.
I'm just back from CRA's Grand Research Challenges in Revitalizing Computer Architecture conference -- held in lovely Aptos, California, just up the road from Monterey (and far sunnier than the snowy DC I've returned to) -- where 50 of the brightest minds in computer architecture research spent 3 days thinking deep thoughts about the field and its biggest challenges for the future. The participants are in the process of finalizing their conclusions, and when they do, you'll see them here first.
But I only bring this up as a way of explaining the lack of updates during a week that was chock full of good and important developments surrounding the science community's efforts to make the case for federal support of R&D in the physical sciences, mathematics and computing. So this post is an attempt to rectify that in one fell swoop.
It began on Tuesday:
National Summit on Competitiveness: Long-time readers may recall that
back in April, as part of the emergency supplemental appropriation to pay for Iraq and Afghanistan, House Science, Commerce, Justice, State Appropriations Subcommittee Chairman Frank Wolf (R-VA) (with help from Reps. Sherwood Boehlert (R-NY) and Vern Ehlers (R-MI)) included language directing the Department of Commerce to convene a meeting of U.S. manufacturers to discuss what could be done to buttress U.S. competitiveness in the global economy. Wolf, who has become one of the strongest champions in Congress for federal support of fundamental research, felt the conference was necessary to expose the Administration to industry concerns about the impact of the federal government's long-term underinvestment in the physical sciences.
The summit was held Tuesday (December 6, 2005) and attracted over 50 CEOs (pdf), university presidents, and agency directors, as well as four members of the President's cabinet -- Sec. Samuel Bodman (Energy), Sec. Margaret Spellings (Education), Sec. Carlos Gutierrez (Commerce) and Sec. Elaine Chao (Labor).
The good news is that the CEOs made "support for fundamental research" the primary message they brought to the cabinet officials -- a very important change of emphasis for most CEO advocacy efforts, which tend to focus on tax law changes or regulatory relief as their prime agenda items. The "Statement of the National Summit of Competitiveness" (pdf), released by the conferees immediately following the summit, puts the message bluntly:
The National Summit on Competitiveness has one fundamental and urgent message: if trends in U.S. research and education continue, our nation will squander its economic leadership, and the result will be a lower standard of living for the American people.

The participants focused on six specific recommendations:
- Increase the federal investment in long-term basic research by 10 percent a year over the next seven years, with focused attention to the physical sciences, engineering and mathematics.
- Allocate at least 8 percent of the budgets of federal research agencies to discretionary funding focused on catalyzing high-risk, high-payoff research.
- By 2015, double the number of bachelor's degrees awarded annually to U.S. students in science, math, and engineering, and increase the number of those students who become K-12 science and math teachers.
- Reform U.S. immigration policies to enable the education and employment of individuals from around the world with the knowledge and skills in science, engineering, technology and mathematics necessary to boost the competitive advantage of the U.S.
- Provide incentives for the creation of public-private partnerships to encourage U.S. students at all levels to pursue studies and/or careers in science, math, technology and engineering.
- Provide focused and sustained funding to address national technology challenges in areas that will ensure national security and continued U.S. economic leadership, including nanotechnology, high-performance computing, and energy technologies.

These are recommendations well-grounded in recent reports of the National Academies, the Council on Competitiveness, the Task Force on the Future of American Innovation, the Business Round Table, and many others (pdf). Whether the recommendations will resonate within the Administration remains to be seen. Until recently, the Administration has adopted a rather head-in-the-sand approach regarding the state of federal support for fundamental research. Members of the Administration continue to note that federal support for R&D has risen 45 percent since 2001, while failing to recognize that the great bulk (pdf) of that increase has been in shorter-term, defense-related development work. Long-term, basic research in the physical sciences, mathematics and computing has been flat or declining over the same period. But the persistent pressure from industry (industry has really stepped up its involvement in this advocacy this year, as this conference demonstrated) may be having some effect. Members of the Administration (beyond the usual suspects at OSTP) are beginning to allow a level of dialog with the community that wasn't happening six months ago. (That's intentionally cryptic.) There's no guarantee that it will result in anything, but it's an encouraging development.
Also encouraging is the imminent introduction of two separate, but very similar, bills designed to push forward an "innovation agenda" that both include substantial authorizations for increased funding for fundamental research in the physical sciences:
Ensign/Lieberman National Innovation Act of 2005: Planned for introduction on December 15th, this bill, co-introduced by Sens. John Ensign (R-NV) and Joseph Lieberman (D-CT), would enact most of the recommendations of the Council on Competitiveness' National Innovation Initiative (which we've detailed here). The bill is a pretty massive effort that includes another authorization for "doubling" NSF by 2011; establishes "Innovation Acceleration Grants," which encourage federal research agencies to allocate 3% of their R&D budgets to grants directed toward "high-risk frontier research"; makes permanent the R&E tax credit; increases NSF graduate research fellowship funding; authorizes a DOD competitive traineeship program for undergrad and grad students in defense science and engineering; and authorizes new "Professional Science Master's Degree Programs" to increase the number of qualified scientists and engineers entering the workforce. The bill is actually more of an omnibus -- it contains provisions that will likely result in referrals to six or seven different Senate committees -- which works against it getting passed in its current form. But it's an important placeholder for these issues in Congress, and it's likely that each of its provisions could find its way into bills that do move. Rep. Bob Goodlatte (R-VA) plans to introduce a similar measure in the House.
Alexander/Bingaman Innovation Bill: Sens. Lamar Alexander (R-TN) and Jeff Bingaman (D-NM) plan to introduce a bill soon that would enact most of the recommendations of the recent National Academies report Rising Above the Gathering Storm. We've previously covered the recommendations from that report. Alexander and former Commerce Secretary (and still close friend of the President) Don Evans recently took to the airwaves to talk up the report and Alexander's legislation, with Alexander telling CNBC that he was calling on the President to focus on this innovation issue in his State of the Union address in January -- which would represent a remarkable elevation of the issue. You can download the clip (about 13 megs, asf format) here.
Finally, there's been lots of good recent press on the issue. Here are some quick-and-dirty summaries:
In the five decades since I began working in the aerospace industry, I have never seen American business and academic leaders as concerned about this nation's future prosperity as they are today.

On the surface, these concerns may seem unwarranted. Two million jobs were created in the United States in the past year. Citizens of other nations continue to invest their savings in this country at a remarkable rate. Our nation still has the strongest scientific and technological enterprise -- and the best research universities -- in the world.
But deeper trends in this country and abroad are signs of a gathering storm. After the Cold War, nearly 3 billion potential new capitalists entered the job market. A substantial portion of our workforce now finds itself in direct competition for jobs with highly motivated and often well-educated people from around the world. Workers in virtually every economic sector now face competitors who live just a mouse click away in Ireland, Finland, India, China, Australia and dozens of other nations.
In the face of report after report indicating that the United States is at grave risk of losing its technological edge — which in turn is the basis of the high U.S. standard of living — the Bush administration and the GOP Congress so far have been (to be charitable) behind the curve on science and technology.

Last year, Congress actually cut the budget of the National Science Foundation, and Bush’s 2006 budget called for less funding than the agency had in 2004. Wolf won a small increase, but still not enough to match 2004.
The Department of Energy’s Office of Science, the primary funder of physics research, got just a 2.9 percent increase in fiscal 2005 and 0.9 percent this year — a cut after inflation.
The Defense Advanced Research Projects Agency, the incubator of, among other things, the Internet and laser technology, got a 5 percent increase in fiscal 2005. The House approved 4.2 percent for fiscal 2006, while the Senate called for a 1.8 percent cut.
At the innovation summit on Tuesday, Deputy Commerce Secretary David Sampson repeated the familiar administration line that research and development funding has increased 45 percent since 2001 and represents 13.6 percent of the federal discretionary budget.
Sampson also asserted that the U.S. economic growth rate, 4.3 percent, is “the fastest in the world,” and that “all of President Bush’s policies — tax, research and development, education and workforce development — are dedicated to making America more competitive.”
In fact, the U.S. growth rate trails that of China (9.4 percent), Hong Kong (8.2 percent) and India (8 percent), and all the evidence indicates that those countries are far outstripping the United States in the training of scientists and investment in research and development.
...
Bush deserves credit for aggressively responding to the No. 1 threat to America’s well-being — terrorism. He needs to do better in responding to the No. 2 threat, foreign economic competition.
I don't know about you, but I sometimes grow weary hearing big-picture thinkers tell us we need more mathematicians and scientists. Maybe it was because I wasn't very interested in those subjects as a kid. Whatever, all the talk about math and science can leave my politics-and-history mind blank.

Several students in Allen said the same thing the other day. When I asked a classroom of high schoolers how many wanted to study math and science in college, one student shot up his hand and said we shouldn't forget the "bohemian" side of the brain, meaning the side that worries about things like war and peace. A number of his fellow students nodded.
I like their independence, but here's the plain truth that people like me need to remember: We either champion math and science, or we lose our footing in the world. That's hard to imagine since we're the Big Cat economically, militarily and politically.
But if our schools downplay math and science, Americans will become the 7-foot basketball player who stumbled over his own clumsy feet running down the court. While we're trying to get back up, little fast guys will run right by.
Tech luminaries, academics, researchers and business leaders have been sounding alarm bells about America's eroding competitiveness in science and technology for more than a year.
In study after study, groups such as the Council on Competitiveness, the National Academies, TechNet and AeA explained the problems in clear and stark terms. The rise of tech powerhouses in China, India and elsewhere, and the parallel decline at home in math and science education, in research and development investments, and in broadband infrastructure, have put America's economic leadership and prosperity at risk. These groups also provided sensible, detailed and often strikingly similar solutions to ensure America remains No. 1.
In Washington, however, it all seemed to fall on deaf ears. Until now....
Both Democrats and Republicans need to stand for something positive going into next year's election, something that addresses the growing fear of middle-class voters that their children won't enjoy the same opportunities that they've had. Unless Congress adopts legislation to restore America's competitive edge, those fears will be warranted.
Anyway, as this has already turned into the mother of all blog posts, I'll stop there. But I close with the opinion that there's some reason to be optimistic about the federal priority for fundamental research changing for the better. The pressure is mounting from numerous fronts: industry is now heavily invested in making the case, significant efforts in Congress are underway, the press has cottoned on to the message, and, as I'll detail in a future post, public attitudes about federal support for research are very positive. All that's really left is for the President to make this a national priority.
Let's hope that he does....
Business Week has a piece that ran yesterday on TechNet's annual innovation summit held earlier this week. The summit brings together TechNet's CEOs and includes a few sessions taped with PBS commentator Charlie Rose. I went to the summit last year and was impressed by the event, but a little disappointed that the number one focus on the agenda appeared to be the issue of expensing stock options (obviously a big concern to Silicon Valley CEOs). This year, it appears there's been a lot more emphasis on R&D funding and competitiveness issues, which is a very good thing. Here's a snippet:
Tech leaders fretted that falling R&D spending could cripple the U.S. in the future. "I'm very worried, as we cut back on our R&D, that we will fall behind the rest of the world," said [John] Chambers[, CEO of Cisco]. [Venture capitalist John] Doerr also lamented the lack of open-ended research at organizations like the Defense Advanced Research Projects Agency, which currently is more focused on specific programs.
Along the same lines, participants in the conference called for fewer limits on immigration. More stringent immigration limits, thanks to post-9/11 security concerns, are a big problem, said Doerr, because they're cutting the U.S. off from foreign research and engineering talent: "Imagine innovation without [former Intel (INTC) CEO] Andy Grove, without Jerry Yang, without [Google (GOOG) co-founder] Sergey Brin." Grove hails from Hungary, Yang from Taiwan, and Brin from Russia.
The result of immigration limits is that we're losing more foreign-born people who get educated here, said Esther Dyson, editor of the tech newsletter Release 1.0. "Right now, we're spending resources on people only to send them back to other countries," she said. "They used to stay here."
I have to say, one of the big reasons we're getting any traction in the science advocacy community for our issues is that industry leaders are stepping up to the plate, using some of their valuable access to decision makers to deliver this important message.
The most recent positive result of that traction is Tuesday's release of the House Democrats' Innovation Agenda. Their proposal is chock full of good ideas.
There aren't many things to disagree with in the Democrats' proposal -- indeed, just about all the ideas proposed have strong bipartisan support. The only worrying aspect of this from my perspective is that it comes crafted as a partisan document. While I would enjoy nothing more than to have the two parties battle it out to show who can support these ideas more emphatically, there's an equal risk (maybe more likely, given the current polarization) of creating a partisan divide where there needn't be one (and there isn't one now).
There are a couple of other bipartisan efforts in the embryonic state right now to enact many of these same goals. Sens. Joseph Lieberman (D-CT) and John Ensign (R-NV) are working to put the finishing touches on legislation for introduction that would implement the recommendations of the Council on Competitiveness' National Innovation Initiative; and Sens. Lamar Alexander (R-TN) and Jeff Bingaman (D-NM) are moving to craft legislation in response to the recommendations contained in the recent National Academies Rising Above the Gathering Storm report (which we've detailed previously).
So I hope the Democrats get lots of well-deserved kudos for stating so explicitly the things they're prepared to do to promote American innovation and competitiveness, and I hope it drives Congress generally toward being more supportive of efforts like the bipartisan ones noted above so we can see some real progress moving this agenda forward.
The San Diego Union Tribune has a nice piece today on supercomputing, with a particular focus on the San Diego Supercomputer Center. Here's a snippet:
Jean-Bernard Minster wants to know how a magnitude-7.7 earthquake would affect Southern California. J. Andrew McCammon wants to find a cure for AIDS. Michael Norman wants to learn how the universe began. All of them rely on supercomputers in their quest for answers.
Twenty years ago this Monday, the San Diego Supercomputer Center began using what was then the world's most powerful computer. Now, its data-crunching successors worldwide are indispensable to science, engineering, business, even the war on terrorism.
...
Fran Berman, director of the San Diego Supercomputer Center, said one way to think about these high-end tools is to compare them to high-performance race cars.
"It's not easy for you and I to buy an Indy 500 car and to maintain that," she said. "That's where it's important to have government and large-scale investment in these kinds of computers. ... And a real concern from the scientific community right now is that (U.S.) leadership is really falling behind."
In November 2004, Congress passed legislation calling for an additional $165 million a year for research to develop new supercomputers. But President Bush's fiscal 2006 budget didn't allocate any funds. Instead, it requested budget cuts for supercomputing research at the Department of Energy.
As we reported on Wednesday, Congress restored some of that funding in the FY 06 Energy and Water Appropriations.
Anyway, the article is called "Supercomputing now indispensable" and it's worth a read...
Marguerite Reardon writes in CNET News.com what's becoming a very familiar refrain:
An outspoken group of information and communications technology innovators is worried that the United States is falling behind the rest of the world in technological innovation because fewer dollars are being allocated to long-term research.
Many in the research community also believe that the research being conducted today is too focused on short-term, market-oriented results. The current DARPA policy, which mandates 12-month "go, no go" research milestones for information technology, has shortened deadlines, thus discouraging long-term research. And with more research focused on national security, programs formerly open to academics are now classified. DARPA has also slashed spending on academic research.
"Traditionally funding in computer sciences has come from the U.S. government," Kleinrock said. "And it's contributed to some remarkable advances, such as the Internet and artificial intelligence. They (the government) used to step back and with some direction let you go develop something new. But that's not the case today. And DARPA is no longer thinking long-range."
More competition, fewer dollars
The effects have been significant. In the last five years, IT proposals to the National Science Foundation jumped from 2,000 to 6,500, forcing the agency to leave many proposals unfunded. Other agencies, such as NASA, have also reduced spending on communications research. Since most government funding comes only from these two sources, researchers are flocking toward the NSF as DARPA cuts back or changes its priorities.
Read the whole thing here. And check here or here for good collections of similar stories that have run this year.
Washington Post columnist Sebastian Mallaby has an interesting op-ed today inspired by news of Microsoft chairman Bill Gates' tour of college campuses, urging students to consider majoring in computer science. The piece does a good job of making the case that Gates makes in his talks to students -- computing is a field with a history of producing really great stuff that promises to make even more really great stuff in the future.
In most fields of human endeavor, you hope for gradual improvements: an engine that's somewhat more efficient, a medicine that improves life expectancy by a few months. But computer power progresses exponentially, warping social life, intellectual horizons and the business playing field.

And Mallaby lays out some examples:
Smart watches will download weather forecasts and news headlines over wireless connections. Smart phones will scan products in department stores to check where better prices can be found. Notebook computers will be portable libraries with the weight of just one novel -- libraries that allow you to scribble in the margins and share your witty insights wirelessly with friends. Your home computer will respond to instructions both written and vocal, and it won't be a computer so much as a network. Music, videos, games, photographs -- oh, yes, and all your lofty intellectual outpourings -- will be beamed around the house to a variety of screens and speakers. The tablet on the kitchen counter will display recipes and shopping lists. The plasma screen on the wall will be for family photos....

All great examples of some of the foreseeable future in computing. And Gates deserves enormous credit for taking on this role of cheerleader for the field. With the current trends facing the discipline, and a general trend of US students shying away from careers in math and the sciences, this effort is sorely needed.
The only additional thing I'd wish for in these kinds of presentations, especially for prospective students, would be to add more sense of purpose to the call. We've had a very interesting discussion about this piece amongst the members of CRA's Government Affairs Committee, including this great observation about what could be said about the "calling" of a career in science generally, and computer science in particular, from Peter Lee at CMU (who gave me permission to post it here):
Choosing to devote your life to science and technology is not a "normal" or "safe" choice. It is a choice made by people who are exceptionally smart, caring, and idealistic. Science makes people smarter and less scared, and it also makes the world better. Becoming a scientist means joining a community of idealists.

There's not much of that in the talks that Gates is giving, but that's understandable. It's easy to lose sight of intellectual and ideological appeal when the practical applications are so plainly visible.
Anyway, a digression from the piece, but something that occurred to me and many of the other members of the committee.
The meat of Mallaby's piece comes in the final three paragraphs though, where he's right on the money:
A lot of Washington debates are about managing bad stuff: war, terrorism, natural disasters, killer viruses, budget deficits, trade deficits, medical inflation, airline bankruptcies, imploding corporate pension plans. But policy also needs to focus on the good stuff: To figure out how we can accelerate progress. If we don't fix the budget deficit, we will be setting ourselves up for economic punishment. But if we don't position ourselves to take advantage of technology, we will be setting ourselves up to miss a huge economic prize.

Maybe Mallaby's seen that argument somewhere else :)

What must we do to remain prize-worthy? The good news is that, in Gates's estimation, between 17 and 19 of the world's top 20 computer science faculties are American, and Microsoft hasn't yet moved many software jobs offshore. But to keep things that way we need to step up federal research funding and relax post-Sept. 11 visa rules, so that the United States remains what Gates calls "an IQ magnet." And because smart Indians, Chinese and others are more likely to return home as their countries grow freer and more prosperous, the United States must focus on growing its own talent. Last year two respected global surveys of math skills in eighth and ninth grades put the United States in 15th and 24th place, respectively. That isn't good enough.
It would take fairly little to address these problems. Last week a panel convened by the National Academies proposed a package of measures that ranged from math prizes for high schoolers to pay raises for math teachers, along with a program to boost federal research funding by 10 percent annually for seven years. The total price tag comes to $10 billion annually, but the nation spends nearly twice that amount on absurd farm subsidies. What kind of priorities are those?
The bad news is that the stars are aligning in such a way as to guarantee that there will be no increase for computer science, or the sciences generally, in the foreseeable future. The Republican Leadership is being pushed hard at the moment to find funds to "pay for" the large emergency supplements paid out for hurricane relief. Odds are those funds will come through across-the-board cuts to non-defense, non-security related discretionary spending. Look for science agencies to suffer cuts similar to last year's across-the-board 2 percent reduction (or worse).
One particular computing program is under an even bigger threat. The Senate voted to approve a $55 million cut to DARPA's cognitive computing program as part of the FY 06 Defense Appropriations bill. The out-of-the-blue cut would hit DARPA's $114 million "Learning, Reasoning, and Integrated Cognitive Systems" account, effectively cutting the program in half. The House did not call for a cut in its version of the bill, so CRA is working to urge members of the conference committee to abandon the Senate cut and embrace the House number. We'll have all the details in the next post. I just thought it worthy of mention that, even as the calls keep coming for increased support of computer science and the physical sciences, and even with all the progress that has been made in drawing the linkage between federal investment in university research and our ability to continue to innovate, a significant percentage of our policy leadership still doesn't get it.
A quick pointer to two interesting not-directly-related pieces running today. First is Aliya Sternstein's article in Federal Computer Week that fleshes out the PITAC to PCAST switch we noted back on September 30th. She quotes CRA Chair Dan Reed and ITAA president Harris Miller:
Former PITAC member Dan Reed, vice chancellor of IT and chief information officer at the University of North Carolina at Chapel Hill, applauded [PCAST co-Chair Floyd] Kvamme's idea to examine the federal government's commitment to IT R&D. (There's a brief comment from me in there as well.)

"IT pervades so many aspects of science, technology and education that examining it in a holistic context has great value," he said.
"PCAST is really the pre-eminent scientific advisory group to the president," Reed said. "In some ways, this elevates the IT issues to a higher level."
Some industry observers displayed mixed emotions about the turn of events, saying they will hold their breath until PCAST's new lineup materializes and follows through on its promises.
"Having PITAC become part of PCAST is better than nothing, but frankly, I don't think it's an adequate solution," said Harris Miller, president of the IT Association of America, which represents high-tech companies.
Although PCAST is more prestigious and well-regarded by the administration, the members already have too much on their plates, he said, adding that they likely cannot handle PCAST's huge program plus all the items that the PITAC docket would add.
The other interesting piece is by ZDNet News' Declan McCullagh and Anne Broache. It's titled "U.S. cybersecurity due for FEMA-like calamity?" and it covers the lack of adequate attention the Department of Homeland Security has paid to cyber threats to critical infrastructures.
Auditors had warned months before Hurricane Katrina that FEMA's internal procedures for handling people and equipment dispatched to disasters were lacking. In an unsettling parallel, government auditors have been saying that Homeland Security has failed to live up to its cybersecurity responsibilities and may be "unprepared" for emergencies.
"When you look at the events of Katrina, you kind of have to ask yourself the question, 'Are we ready?'" said Paul Kurtz, president of the Cyber Security Industry Alliance, a public policy and advocacy group. "Are we ready for a large-scale cyberdisruption or attack? I believe the answer is clearly no."
The article also features a nice quote from CRA government affairs committee co-Chair Ed Lazowska that sums up the concerns about the agency's research efforts:
But the right tools and funding have to be in place, too, said Ed Lazowska, a computer science professor at the University of Washington. He co-chaired the president's Information Technology Advisory Committee, which published a report in February that was critical of federal cybersecurity efforts.
"DHS has an appropriately large focus on weapons of mass destruction but an inappropriately small focus on critical infrastructure protection, and particularly on cybersecurity," Lazowska said in an e-mail interview.
The department is currently spending roughly $17 million of its $1.3 billion science-and-technology budget on cybersecurity, he said. His committee report calls for a $90 million increase in National Science Foundation funding for cybersecurity research and development.
Until then, Lazowska said, "the nation is applying Band-Aids, rather than developing the inherently more secure information technology that our nation requires."
Both are worth a read!
Mort Zuckerman, editor-in-chief of US News and World Report uses his latest column to berate the Administration for cutting the federal investment in scientific research:
The American century, as the 20th century was known, was built on scientific progress. American corporations were the first to develop major in-house research labs and the first to work closely with academic institutions. After the Soviets launched Sputnik, we went into the overdrive that put a man on the moon.
In the second half of the 20th century, we reaped the harvest: fiber optics, integrated circuits, wireless communications, lasers, the Web, global positioning satellites, hybrid automobiles, video games, computers, and an enormous variety of medical technologies and drugs.
All these inventions and discoveries transformed daily life around the world because American know-how and entrepreneurial energy married them to venture capital, then produced and marketed them.
...
Today, however, this is all being reversed. Why? Two reasons. The first is the cutback in federal support for advanced science. The second, many researchers believe, is that the Bush administration is fostering an antiscience culture. President Bush paved the way to double the National Science Foundation's budget over five years, then, just two years later, he allowed Congress to cut the projected budget by $2 billion. Cut budgets for research and training, and we won't have the economic growth tomorrow that we had yesterday. And this when we face, for the first time in our history, competition from low-wage, high-human-capital communities in China, India, and Asia. At the very least, it means fewer American jobs.
We must find the money to reverse this trend. It is not so much a current expenditure as an investment in our future. But money has to be accompanied by a recommitment to basing policy on professional analysis and scientific data from responsible agencies. An administration that packs advisory committees with industry representatives and disbands panels that provide advice unacceptable to political ideology is shortchanging the future of all of us.
Zuckerman also makes the case for the reestablishment of the Congressional Office of Technology Assessment -- an office set up during the Nixon Administration to provide non-partisan advice to lawmakers on scientific and technical matters, but eliminated in FY 96 as part of congressional belt-tightening. While I agree that the current Administration appears to have issues with scientific advisory bodies that offer advice that conflicts with its policy goals, I'm not sure reconstituting OTA will help. As a veteran of the House Science Committee staff (though after OTA was disbanded), I can attest to the value of having direct contact between Members of Congress and researchers and technologists. I'm sympathetic to arguments that OTA, by virtue of the "buffer" it created between scientists and legislators, encouraged a "bureaucratic" approach to science policy, and I think the most critical functions of the office are probably well-tended to by entities like the Congressional Research Service, the National Academies, and the Government Accountability Office. Plus, as a science advocate now, I appreciate that organizations like CRA are more relied upon by key members of Congress and staff to provide input on science and technology policy.
But otherwise, I think Zuckerman's piece is on the money. He's certainly right about the importance of looking at federal support for research as an investment in the future of the country. Read the whole thing.
Federal Computer Week's Aliya Sternstein has an interesting piece in this week's issue on the role of computing technology in helping predict and mitigate the cost of Hurricane Katrina.
Scientists are using a range of technologies to better predict the impact hurricanes can have on the economy and environment to minimize future damage and save lives.
Supercomputers, modeling programs and geographic information systems are some of the technologies scientists use to track the movement of hurricanes and predict damage. Experts warn, however, that skilled professionals are as crucial to accurate forecasting as technology.
Supercomputers aided the National Oceanic and Atmospheric Administration in accurately forecasting Hurricane Katrina's path. The storm devastated the coastal areas of Alabama, Louisiana and Mississippi.
"Two and a half to three days before the hurricane hit, we were pretty much zoomed in on the Louisiana/Mississippi Gulf Coast as where the hurricane would hit," said Jack Beven, a hurricane specialist at the NOAA Tropical Prediction Center. "It's probably not the most accurate we've been, but it's certainly pretty accurate."
From what I understand, NOAA does a great job with the computing resources it's been allocated. I'm just not sure they've been allocated nearly enough. The article points out that NOAA has been able to upgrade its supercomputing capacity from 0.5 teraflops to 1.5 teraflops within the last year.
In its look at the state of computational science in the U.S. in the last year, the President's Information Technology Advisory Committee (PITAC) (now disbanded, sigh) came up with a really interesting economic case for the need for increased computational resources in hurricane forecasting. I've cited it here once previously, but I'll quote it again:
One nugget I found especially interesting from the presentation [of the PITAC Subcommittee on Computational Science] was an example of both the economic benefit and the health and safety benefit that will arise from more capable modeling enabled by advanced computing. The subcommittee noted that 40 percent of the $10 trillion U.S. economy is impacted by climate and weather. As one example of this, the subcommittee cited the hurricane warnings provided by the National Hurricane Center and the cost of the evacuations that often result. According to the subcommittee, there is $1 million in economic loss for each mile of coastline evacuated. With the current models, the U.S. now "over warns" by a factor of 3, with the average "over-warning" for a hurricane resulting in 200 miles of evacuations -- or $200 million in unnecessary loss per event. Improved modeling (better algorithms, better software, more capable hardware, etc.) would improve the accuracy of forecasts, saving lives and resources.

While over-warning probably wasn't much of an issue in Katrina's case, there are a number of capabilities that we currently lack that may have proven useful. Folks in the severe storms community tell me that current operational forecast models run by NOAA suffer from a number of limitations that work against obtaining accurate predictions of hurricane intensity and path. For example, they cite the lack of resolution in the current models that misses important fine-scale features like rain bands and the eye wall; the lack of coupling between atmospheric, wave and ocean prediction models; and computing resources that can generate only one or a few forecasts (as opposed to large ensembles), which impacts NOAA's ability to improve forecasting skill and quantify uncertainty.
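The subcommittee's arithmetic is easy to sanity-check. Here's a minimal sketch: the dollar and mileage figures are the ones the subcommittee quoted; the variable names and the hypothetical improved over-warn factor are my own illustration, not from the report.

```python
# Sanity check of the PITAC subcommittee's over-warning figures quoted above.
# Constants come from the subcommittee's presentation; the names and the
# hypothetical 1.5x scenario are illustrative only.

COST_PER_MILE_USD = 1_000_000  # economic loss per mile of coastline evacuated
OVERWARN_FACTOR = 3            # miles warned are roughly 3x the miles that needed it
OVERWARNED_MILES = 200         # average unnecessarily evacuated miles per event

# Unnecessary loss per event: 200 miles x $1M/mile = $200M, matching the quote.
loss_per_event = OVERWARNED_MILES * COST_PER_MILE_USD

# The unnecessary fraction of warned miles is (1 - 1/factor), so the total
# warned miles work out to about 300. If better models cut the factor from
# 3x to a hypothetical 1.5x, the unnecessary miles (and dollars) would halve.
miles_warned = OVERWARNED_MILES / (1 - 1 / OVERWARN_FACTOR)
improved_loss = miles_warned * (1 - 1 / 1.5) * COST_PER_MILE_USD

print(f"Unnecessary loss per event today: ${loss_per_event:,.0f}")
print(f"At a 1.5x over-warn factor:       ${improved_loss:,.0f}")
```

Multiply a nine-figure loss per event by a handful of warned storms in a busy season, and the research "bump" discussed below starts to look like a bargain.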
While NOAA's move to a 1.5 teraflop capacity is a welcome change, it's still far below what one would consider a "leadership class" computing capacity for the agency -- like those available at NSF, NASA and DOE centers. I know it's a coarse measure, but 1.5 teraflops doesn't even get you in the top 300 fastest machines -- never mind a machine capable of the kind of improvements hinted at above.* And it's not all about big iron. NOAA needs additional resources to ramp up its infrastructure -- software, hardware and personnel -- and to boost basic research programs within the agency and the university community. Asking for any increase in resources anywhere is obviously very tough in the current budget environment, but the size of the "bump" required here is relatively small, given the potential benefit.
But none of this is intended to take away from the job NOAA has done with the resources it already has. Because of NOAA's forecasts, there was ample warning that this major storm was barreling in on the Gulf Coast and there were reasonable estimates of what it was going to do once it got there. But given sufficient resources the models will get even better, which means the forecasts will get better -- more accurate, more precise, and more timely. How much would it be worth to have the accuracy and precision we have now at 24-36 hours before a major storm available 3 days out? Or five days out?
I know it may seem a bit crass to be talking about boosting funding for computing only days after a tragedy as big as Katrina's impact on the Gulf Coast, but events like this are a trigger for the reevaluation of national priorities, and it seems to me that computing resources at NOAA haven't been a national priority for quite a while.
* Update: (9/16/2005) Actually, it looks like NOAA has slightly more adequate computing resources than the FCW article suggests. According to the Top500 list, NOAA has two machines capable of 4.4 teraflops and two capable of 1.8 teraflops. So I'm not sure what the FCW article reflects. That's still quite some distance from "leadership class" computing, trailing machines in Japan, Sweden, Germany, Russia, Korea, China, and Australia, but it's better than the figures quoted in the article above.
** Update 2: (9/16/2005) Aliya Sternstein writes to note that the 1.5 teraflop measurement cited in the FCW piece applies to the NWS system at the IBM facility in Gaithersburg, MD, not all of NOAA's computational capacity.
Last Thursday, NSF's Computer and Information Science and Engineering directorate (CISE) officially unveiled their Global Environment for Networking Investigations (GENI) initiative, a program designed to "advance significantly the capabilities provided by networking and distributed systems." As NSF points out in their fact sheet covering the program:
The GENI Research Program will build on many years of knowledge and experience, encouraging researchers and designers to: reexamine all networking assumptions; reinvent where needed; design for intended capabilities; deploy and validate architectures; build new services and applications; encourage users to participate in experimentation; and take a system-wide approach to the synthesis of new architectures.

The unveiling of the initiative did not go unnoticed in the press. Wired ran with the story on Friday, quoting CRA board member Jen Rexford and UCLA's Len Kleinrock. Federal Computer Week also had coverage Friday. And today, the New York Times' John Markoff takes a look.
The program has the goal of supporting both a research program and a new "global experimental test facility" -- all for an estimated $300 million. That's a very ambitious budget number in the current environment. But making progress on the challenges posed -- how do you design new networking and distributed system architectures that build in security, protect privacy, are robust and easy to use? -- could make that $300 million seem like one of the better investments taxpayers have made. As Bob Kahn pointed out in his interview with C-Span last week, the original investment in the research behind what would become the Internet turned out to be a pretty good deal....
In any case, we'll follow the progress with the initiative as it moves forward. Any "new start" of this magnitude will require substantial effort and support from the community to demonstrate to policymakers the need addressed and opportunity presented by the new program. And we'll be right there.
The New York Times has a great piece today by reporter Steve Lohr on computer science majors -- what they do ("It's so not programming," one says), what the job market for their skills is like (pretty strong), and what some schools are doing to get the message out.
On campuses today, the newest technologists have to become renaissance geeks. They have to understand computing, but they also typically need deep knowledge of some other field, from biology to business, Wall Street to Hollywood. And they tend to focus less on the tools of technology than on how technology is used in the search for scientific breakthroughs, the development of new products and services, or the way work is done.

The piece would be a great read even without the quotes from CRA's Government Affairs co-Chair Lazowska and current board Chair Dan Reed. And it's a good antidote to the more dour pieces we've seen recently about the future of the field.
...
Edward D. Lazowska, a professor at the University of Washington, points to students like Mr. Michelson [who is going to medical school at Columbia after earning a computer science degree at Washington] as computer science success stories. The real value of the discipline, Mr. Lazowska said, is less in acquiring a skill with technology tools - the usual definition of computer literacy - than in teaching students to manage complexity; to navigate and assess information; to master modeling and abstraction; and to think analytically in terms of algorithms, or step-by-step procedures.
Give it a read: A Techie, Absolutely, and More
The New York Times' John Markoff, who launched much of the media and congressional attention on computer science this year with his April 2005 piece "Pentagon Redirects Its Research Dollars", is still on the computing beat. His most recent is today's "A New Arms Race to Build the World's Mightiest Computer." Here's a sample:
A global race is under way to reach the next milestone in supercomputer performance, many times the speed of today's most powerful machines.

And beyond the customary rivalry in the field between the United States and Japan, there is a new entrant -- China -- eager to showcase its arrival as an economic powerhouse.

The new supercomputers will not be in use until the end of the decade at the earliest, but they are increasingly being viewed as crucial investments for progress in science, advanced technologies and national security.

The article highlights the recent announcements of long-term commitments by a number of countries to fund efforts to develop petaflop-scale computing systems. France, China and Japan have all initiated multi-year investments in programs designed to produce petaflop machines in the next decade. While support for supercomputing research and development here in the U.S. continues to "remain a priority" in the Administration's plans, our commitment to long-term support for the development of these leadership class machines isn't as stellar as it could be. PITAC's June 2005 report on the state of computational science in the U.S. put it a bit more bluntly:

Yet, despite the great opportunities and needs, universities and the Federal government have not effectively recognized the strategic significance of computational science in either their organizational structures or their research and educational planning. These inadequacies compromise U.S. scientific leadership, economic competitiveness, and national security.

As the Council on Competitiveness is fond of noting, in order to compete in the global economy, you must be able to out-compute your rivals. The U.S. needs to ensure that it maintains a commitment to the long-term R&D that will continue to "prime the pump" for the innovations in high-end computing that will allow us to keep pace with our international competitors. Adopting PITAC's recommendations (pdf) would be a good place to start.
A nice follow-up to last week's post on the "science gap" and some of the ways the computing community is dealing with its "image problem" can be found today over at MSNBC in a piece focusing on the new National Center for Women in IT (CRA and CRA-W form one "hub" of NCWIT -- other hubs include the Anita Borg Institute for Women and Technology, ACM, The Colorado Coalition for Gender and IT, Georgia Tech, The Girl Scouts of the USA, and The University of California). The piece is called Fewer women find their way into tech and here's a tease:
The number of women considering careers in information technology has dropped to its lowest level since the mid-1970s -- and one local nonprofit organization intends to do something about it.

Based at the University of Colorado in Boulder, the National Center for Women and Information Technology (NCWIT) wants to know why women are losing interest in technology -- and what can be done to bring them back.
Read the whole thing.
The Washington Post's Politics Columnist (and resident contrarian) Robert Samuelson has an interesting Op-Ed in yesterday's edition dealing with the fact that the U.S. is producing "a shrinking share of the world's technological talent." After noting that there's a pay disparity between science and engineering PhDs and other "elites" like MBAs, doctors and lawyers that probably leads to the production disparity, Samuelson rightly points out that the simple fact that other countries are producing more S&E PhDs doesn't mean that we necessarily lose.
Not every new Chinese or Indian engineer and scientist threatens an American, through outsourcing or some other channel. Actually, most don't. As countries become richer, they need more scientists and engineers simply to make their societies work: to design bridges and buildings, to maintain communications systems, and to test products. This is a natural process. The U.S. share of the world's technology workforce has declined for decades and will continue to do so. By itself, this is not dangerous.

The dangers arise when other countries use new technologies to erode America's advantage in weaponry; that obviously is an issue with China. We are also threatened if other countries skew their economic policies to attract an unnatural share of strategic industries -- electronics, biotechnology and aerospace, among others. That is an issue with China, some other Asian countries and Europe (Airbus).

What's crucial is sustaining our technological vitality. Despite the pay, America seems to have ample scientists and engineers. But half or more of new scientific and engineering PhDs are immigrants; we need to remain open to foreign-born talent. We need to maintain spectacular rewards for companies that succeed in commercializing new products and technologies. The prospect of a big payoff compensates for mediocre pay and fuels ambition. Finally, we must scour the world for good ideas. No country ever had a monopoly on new knowledge, and none ever will.

Putting aside the fact that Samuelson apparently unwittingly puts his finger on the need for producing more US-born and naturalized S&E PhDs -- after all, given current agency practices, they are essentially the only ones able to do the defense-related research that will preserve "America's advantage in weaponry" -- he's generally right on. The simple fact that other countries are producing S&E PhDs at rates higher than U.S. production isn't the worry. The worry is when America's global competition uses that newly-developed capacity for innovation and technological achievement to target sectors traditionally important to America's strategic industries. IT is one such crucial sector.
As Samuelson points out, one way to insure the U.S. remains dominant, especially in a sector like IT, is to make sure the U.S. continues to attract the best minds in the world to come study and work here. Unfortunately, as we've noted frequently over the last couple of years, the environment for foreign students in the U.S. is not nearly as welcoming as it once was.
Another is to nurture and grow our own domestically-produced talent in the discipline. But the challenges here are also tall. The most recent issue of the Communications of the ACM contains a very interesting (and on point) piece (pdf) about whether the computing community in the U.S. needs to do a better job of evangelizing what's truly exciting about the discipline to combat dropping enrollment rates and dropping interest in computing. The piece by Sanjeev Arora and Bernard Chazelle (thanks to Lance Fortnow for pointing it out on his excellent Computational Complexity blog), identifies the challenge:
Part of the problem is the lack of consensus in the public at large on what computer science actually is. The Advanced Placement test is mostly about Java, which hurts the field by reducing it to programming. High school students know that the wild, exotic beasts of physics (black holes, antimatter, Big Bang) all roam the land of a deep science. But who among them are even aware that the Internet and Google also arose from an underlying science? Their list of computing "Greats" probably begins with Bill Gates and ends with Steve Jobs.

A recent study by the Pew Internet Project demonstrates that American teenagers are tied to computing technology: 89 percent send or read e-mail; 84 percent visit websites about TV, music or sport stars; 81 percent play online games; 76 percent read online news; 75 percent send or receive instant messages. Yet that increasing use of technology doesn't appear to make them any more interested in studying the science behind the technology. Maybe that's not surprising -- the fact that most teenagers probably have access to and use cars doesn't appear to be swelling the ranks of automotive engineers. Maybe there's a perception among bright teenagers that computing is a "solved" problem -- or as John Marburger, the President's science advisor, put it at a hearing before the House Science Committee early in his tenure, maybe it's a "mature" discipline now, perhaps not worthy of the priority placed on other more "breakthrough" areas of study like nanotechnology. I think Arora and Chazelle do a good job of debunking that perception, demonstrating that computing is thick with challenges and rich science "indispensable to the nation" to occupy bright minds for years to come.
...
We feel that computer science has a compelling story to tell, which goes far beyond spreadsheets, Java applets, and the joy of mouse clicking (or even Artificial Intelligence and robots). Universality, the duality between program and data, abstraction, recursion, tractability, virtualization, and fault tolerance are among its basic principles. No one would dispute that the very idea of computing is one of the greatest scientific and technological discoveries of the 20th century. Not only has it had huge societal and commercial impact but its conceptual significance is increasingly being felt in other sciences. Computer science is a new way of thinking.
But the perception persists. Computing has an image problem. Fortunately, the computing community isn't standing still in trying to address it (though maybe it's only just stood up). At the Computing Leadership Summit convened by CRA last February, a large and diverse group of stakeholders -- including all the major computing societies, representatives from PITAC, NSF and the National Academies, and industry reps from Google, HP, IBM, Lucent, Microsoft, Sun, TechNet and others (complete list and summary here (pdf)) -- committed to addressing two key issues facing computing: the current concerns of research funding support and computing's "image" problem. Task forces have been formed, chairmen named (Edward Lazowska of U of Washington heads the research funding task force; Rick Rashid of Microsoft heads the "image" task force), and the work is underway. As the summary of the summit demonstrates, no ideas or possible avenues are off the table.... We'll report more on the effort as it moves forward.
As Arora, Chazelle and Samuelson all point out, the challenges are tall, but the stakes for the country (never mind the discipline) are even higher.
Turing Award winner Vint Cerf and ITAA head Harris Miller have a fantastic op-ed in today's Wall Street Journal raising concerns about US competitiveness in light of a declining federal R&D budget. The article is behind the WSJ pay wall, but can be viewed online for the next seven days here. Some snippets:
America will soon find its grip on the levers of international commerce slipping as we turn our backs on a proud tradition of technology innovation. The stewards of our national destiny are busily tightening the tap on the federal R&D budget, the most important source of funding for programs that seek to answer fundamental questions of science and technology.

The piece goes on to quote a number of indicators -- many of the same ones cited in the Task Force on the Future of American Innovation's influential Benchmarks of our Innovation Future report -- that show that while the U.S. remains in the leadership position in innovation and R&D investments, all of the trendlines are slanting the wrong way....
In the 1960s and '70s, a collection of academics and private-sector technologists, including a co-author of this piece, used findings funded by the Pentagon's Advanced Research Projects Agency (now DARPA), to participate in implementation of the first wide-area packet switched network (the ARPANET) and the subsequent integrated collection of packet-switched networks (the Internet).
Now DARPA officials have revealed a shift in focus away from its history of open-ended long-range research, which typically has been performed in universities and nonprofit institutions. According to recent news reports, DARPA funding for university researchers in computer science has fallen from $214 million to $123 million from 2001 to 2004. Moreover, the focus of DARPA R&D is more near-term and more immediately defense-oriented than before. While this is defensible in some ways, the largest impacts of long-term research funded in the past by DARPA have been in areas that have wider or dual application to defense and the civilian sector.
The U.S. is already lagging behind in R&D funding. Our total national spending on R&D is 2.7% of our GDP, and now ranks sixth in the world, in relative terms, behind Israel (4.4%), Sweden (3.8%), Finland (3.4%), Japan (3.0%) and Iceland (2.9%). The federal government's share of total national R&D spending has fallen from 66% in 1964 to 25%.
Some of the outright cuts in the president's proposed R&D budget include the following:
The Department of Energy's Office of Science would see its R&D funding fall 4.5% to $3.2 billion.
The Department of Agriculture would see its R&D funding decline 14.6% to $2.1 billion.
Funding for all three multi-agency R&D initiatives would decline in FY 2006, a category that includes programs such as the National Nanotechnology Initiative and the Networking and Information Technology R&D initiative.
The proposed cuts come at a time when other nations have fixed their sights firmly on overtaking our technological lead, especially in information technology. For those of us in industry and academia, this shift in policy represents a major detour in the marathon race for global economic leadership.
The facile solution is to turn to private industry and academia to make up the difference. But R&D funding from private industry, while currently growing above inflation, is susceptible to general economic cycles, and by its nature it is focused on the here and now. Meanwhile, many academic institutions are battling lagging enrollment and turning to unconventional fund-raising means merely to stay afloat. The difficulty in obtaining visas for foreign scientists has also restricted an important source of talent in the research community.

In a very real sense, today's R&D agenda determines where America will find itself in the future. The benefits of vigorous, federally funded academic R&D programs reaped by American society at large have been enormous. Our domestic and global economies thrive on the results of such work. Private sector programs alone cannot produce comparable results, in part owing to an ethical obligation to deliver bottom-line business results for their stockholders. The U.S. government needs a long-term strategy for continued economic growth. A strong and thriving academic R&D program is critical to that strategy. To choose otherwise is a recipe leading to irrelevance and decline.

I'm thrilled to see this piece in the WSJ today....
I'll have a bit more comment on this later when I have a few minutes, but I wanted to get the pointer to the article up asap. Read the whole thing, while it's still available!
Update: The article is finding its way around Congress. Rep. Anna Eshoo (D-CA) circulated the piece in a "Dear Colleague" letter along with this text:
Once again, high technology leaders are warning that declining federal investments in research and development are allowing the rest of the world to catch up. This isn't a problem that can be blamed on Europe or developing economies in Asia. It's a problem that we're creating. If we're to maintain our economic leadership for future generations, we need to increase the federal commitment to R&D instead of cutting it.
InternetNews.com has coverage of the opening of Microsoft Research's sixth annual Faculty Summit, "a unique opportunity for faculty members and Microsoft researchers, architects, and executives to collectively discuss a vision for the future of computing." Microsoft Chairman Bill Gates had some interesting comments to open the event (along with ACM past-President Maria Klawe). Here's a sample:
But today, Gates and Klawe focused on the present; specifically, how to encourage more students to enroll in computer-science programs so that the industry will have enough qualified engineers to work on those future innovations.

Klawe presented some grim figures: The popularity of computer science as a major has fallen more than 60 percent between 2000 and 2004, she said, even though software engineering and several related jobs will be among the fastest growing through 2012.

Some of that slack might be taken up by girls if they didn't have such a seeming aversion to the field. Klawe said participation of women in computing has gone down over the past 25 years, with only around 15 percent of computer-science Ph.D.s going to women.

When Klawe asked Gates what could be done, he seemed to flounder. When he responded, "There's no magic answer. Maybe get women in the field to be more visible?" Klawe hooted him down.

"No, that's not the answer," she said. "We all do it, but we're not getting anywhere with it."

"You lose them at about five stages," Gates agreed. "And, if there aren't enough women in the field, it makes it less attractive, even if everything else is good. There's a critical-mass element to this."

The decline in federal funding for academic research and graduate education doesn't help, the two agreed. Money from the Defense Advanced Research Projects Agency (DARPA) dropped by half last year.

"The biggest payoff for federal funding of research is in computer science," Gates said, pointing to the economic and technology boom of the 1990s. "Department of Defense money was one of the elements that allowed us to turn this into one of the greatest success periods the U.S. has ever had."

Computer science could fuel another such boom in the next 10 years, according to Gates.

"Computer science is becoming the toolkit for all the sciences," he said. As all disciplines become more data-driven, they're turning to computer science to make sense of the huge amounts of data. "Computer science helps model the world," he added.

Newsday also has coverage of the event, focusing on the declining CS/CE enrollment question:

Speaking to hundreds of university professors, Microsoft Chairman Bill Gates said Monday that he's baffled more students don't go into computer science. Gates said that even if young people don't know that salaries and job openings in computer science are on the rise, they're hooked on so much technology -- cell phones, digital music players, instant messaging, Internet browsing -- that it's puzzling why more don't want to grow up to be programmers.

"It's such a paradox," Gates said. "If you say to a kid, 'Yeah, what are the 10 coolest products you use that your parents are clueless about, that you're good at using,' I don't think they're going to say, 'Oh, you know, it's this new breakfast cereal. And I want to go work in agriculture and invent new cereals or something.' ... I think 10 out of 10 would be things that are software-driven."

...

Gates said computer scientists need to do a better job of dispelling that myth and conveying that it's an exciting field.

"How many fields can you get right out of college and define substantial aspects of a product that's going to go out and over 100 million people are going to use it?" Gates said. "We promise people when they come here to do programming ... they're going to have that opportunity, and yet we can't hire as many people as we'd like."

Both pieces are chock full of interesting quotes and worth reading. We'll have more on how the computing research community is organizing to take on these issues soon, so watch this space....
Update: Here's the transcript from Gates and Klawe's opening remarks. And here's a video.
There's an interesting article by Sallie Baliunas at Tech Central Station today on research funding. The piece notes a recent Nature article that suggests scientific misbehavior might be linked to "perceptions of inequities in the [science] resource distribution process" and connects that with the tendency among federal funding agencies to shift emphasis from basic to applied research.
Since 1970, total federal non-medical research spending as a fraction of Gross Domestic Product has declined by about one-third. No formal history has tracked research misbehavior, leaving it impossible to say if ongoing stresses on budget allocation systems would partly explain current misbehavior.

Continual budget pressures, though, are transforming U.S. research and development. Funding agencies now weigh more heavily a proposal's aim toward practical applications, especially those with near-term payoff.

The rest of the article focuses on this trend, citing as an example PITAC's 1999 report "Investing in our Future" that noted that federal funding in computing research was "excessively focused on near-term problems" (a problem that persists) and providing examples of the sort of serendipitous discovery that doesn't occur in that environment.
Though I'm not sure what to make of the linkage between this change in focus and scientific misbehavior, the article's point on the real cost of the push towards applied research is well-taken. "Questions of how funding is distributed are as critical as how much funding."
Here's the whole thing.
Two interesting stories came through the Triangle (North Carolina) Business Journal over the weekend focusing on the lack of undergraduates majoring in CS and CE. The first one, entitled "Fewer students majoring in industry could lead to labor shortage," notes that CS enrollments at North Carolina State and the UNC campuses have dropped from 1,988 in 2000-01 to 1,333 in 2004-05. The story was picked up and nationally syndicated by MSNBC. A second story focuses on the lack of minorities entering CS-related fields.
Both stories quote Andrew Bernat and cite the CRA as a key source. Could this be a sign that at least the business media are showing an increased interest in computing research and its effects on the American economy?
Additional news stories mentioning CRA can be found at http://www.cra.org/reports/news/index.html.
Aliya Sternstein of Federal Computer Week has a piece today on the demise of the latest iteration of PITAC. It's a good summary of the situation, which we've covered in this space previously. Plus, it's got a good quote from Dan Reed, the incoming Chair of CRA:
"People are a little demoralized about the fact that PITAC hasn't been renewed," Reed said.

It would be unfortunate if PITAC does not get the chance to review the nation's IT research, Reed said. "Six years in the information technology business is a lifetime, and it seems opportune," he said today. "My personal hope is that PITAC will be reconstituted quickly."

Read the whole thing here.
The New York Times editorializes today that, despite the very real threat, the nation continues to be woefully unprepared to defend against a "cyberattack" on our critical infrastructure.
Power grids, water treatment and distribution systems, major dams, and oil and chemical refineries are all controlled today by networked computers. Computers make the nation's infrastructure far more efficient, but they also make it more vulnerable. A well-planned cyberattack could black out large parts of the country, cut off water supplies or worse. The Nuclear Regulatory Commission found that in 2003 a malicious, invasive program called the Slammer worm infected the computer network at a nuclear power plant and disabled its safety monitoring system for nearly five hours.

Despite the warnings after 9/11 -- and again after the 2003 blackout -- disturbingly little has been done. The Government Accountability Office did a rigorous review of the Department of Homeland Security's progress on every aspect of computer security, and its findings are not reassuring. It found that the department has not yet developed assessments of the threat of a cyberattack or of how vulnerable major computer systems are to such an attack, nor has it created plans for recovering key Internet functions in case of an attack. The report also expressed concern that many of the department's senior cybersecurity officials have left in the past year. Representative Zoe Lofgren, the California Democrat who was among those who requested the G.A.O. report, said last week that it proved that "a national plan to secure our cybernetworks is virtually nonexistent."

As we've noted previously, the President's IT Advisory Committee came to a similar conclusion in its report (pdf) on Cyber Security R&D, released last March. That report concluded that the federal government is largely failing in its responsibility to protect the nation from cyberthreats and recommended an immediate increase in the amount of support for cyber security research at NSF, DHS, and DARPA, and greater emphasis on civilian networks in addition to military-oriented networks.
Unfortunately, the early results of this appropriations season show that the recommendations for DHS continue to go largely unheeded....
Update: Ed Felten has a thoughtful post at Freedom to Tinker on the difficulty of addressing the cyberthreat problem with government action.
The Chronicle of Higher Ed today has coverage (free until 6/2 apparently) of the May 12th House Science Committee hearing on "The Future of Computer Science Research in the U.S." that's generally pretty good. But it makes an odd point at the end that doesn't accurately represent what went on at the hearing. Here's the paragraph:
[DARPA Director Tony] Tether challenged Mr. [Tom] Leighton [, co-founder and Chief Scientist at Akamai Technologies] and Mr. [Bill] Wulf [, President of the National Academy of Engineering] to supply examples of important projects that the agency has refused to support, and they did not immediately offer any. That shows, Mr. Tether said, that the agency's priorities are properly placed.

At the end of the 2 hour, 19 minute hearing, in response to the committee's very last question, Tether told the panel that in dealing with the university computer science community he saw "a lot of hand-wringing" but didn't get many "actionable ideas" from the community. Science Committee Chairman Sherwood Boehlert then turned to Wulf and Leighton and asked if they could take that as a challenge and provide a list to the committee and to Tether. Both responded that they'd be happy to, and Boehlert noted that he'd make that part of the post-hearing questions that will be put to the witnesses (and noted the challenge in his press release).
I understand both Wulf and Leighton are eager to respond to the challenge. Leighton told me after the hearing that he was getting ready to wave the PITAC report on Cyber Security R&D as a start (the focus of much of his testimony), which contains specific recommendations in 10 areas of cyber security research currently under-supported. Both Leighton and Wulf will be reaching out to the community to craft a list that will be most useful to DARPA and DOD and most responsive to the committee's request (which hasn't yet been received, as far as I know). There are plenty of resources from which to draw -- PITAC's Cyber Report, Defense Science Board, CRA's Grand Challenges conferences, National Academies reports, etc.
The idea that either Wulf or Leighton were dumbstruck by the question is just wrong, and the idea that the community lacks an adequate response to the committee's challenge is equally wrong.
Otherwise, the article does a decent job of summarizing the hearing. From my perspective, the hearing was incredibly useful. I could spend a lot of space here dissecting the testimony of Marburger and Tether -- though frequent readers of the blog won't need my dissection to spot the points of contention in both sets of testimony. Tether essentially argued in his oral testimony (and half of his written testimony) that DARPA has reduced its funding for university-led computer science research because maybe it's focusing on multi-disciplinary research now -- something he says he deduced by looking at university web pages. But in the appendix to his testimony, he provides the response to the same question he gave to the Senate Armed Services Committee, compiled by the DARPA comptroller, which includes these five reasons for the shift:
1. A change in emphasis in the high performance computing program from pure research to supercomputer construction;
2. Significant drop in unclassified information security research;
3. End of TIA-related programs in FY 2004 due to congressional decree, a move that cost universities "a consistent $11-12 million per year" in research funding;
4. Research into intelligent software had matured beyond the research stage into integration;
5. Classified funding for computer science-related programs increased markedly between FY 2001 and FY 2004, but universities received none of this funding.
From my perspective, having the DARPA director stand before the committee (literally) and affirm that the agency has significantly reduced its support for university-led, long-range computing research was very useful. The community can raise concerns about DARPA's priorities, but ultimately it's up to the Director and the Administration to set them as they see fit. What's more important to me is that the impact of DARPA's (now undisputed) withdrawal on the overall IT R&D enterprise be adequately assessed and addressed. The gap that DARPA leaves is substantial -- both in terms of monetary support and in losing a funding model that has contributed so much to the extraordinarily productive environment for innovation that is the computing research community. NSF is great at what it does -- funding individual investigators and research infrastructure at universities -- but there was substantial value from DARPA's approach of focusing on particular problems and nourishing communities of researchers to address them. Without DARPA, that approach is largely absent in the federal IT R&D portfolio.
It was also useful for the Science Committee to get exposure to the concerns the community has had with DARPA over the last several years. Tether's performance -- literally standing before the committee (I staffed a lot of hearings for the House Science Committee under two different chairmen and never once saw a witness rise before the committee and wander around the hearing room while testifying...), delivering remarks 15 minutes over the 5 minute time limit imposed by the committee, and most importantly, being largely unresponsive to the three questions the committee posed to him prior to the hearing -- confirmed to the committee Chair and staff that the concerns the community had shared with them had merit. The result is that the committee intends to remain engaged on this issue, which is to the community's great benefit, I think.
The committee plans to proceed with the issue in the coming months in non-hearing venues. I'll bring you developments as this moves forward during the summer and fall.
ACM president and former CRA board chair David Patterson writes a pointed Op-Ed at C-Net today about whether the U.S. will lead critical IT innovation in the 21st Century, or whether the changing landscape for support of fundamental IT research will constrain that innovation pipeline.
Patterson's piece follows his earlier editorial with Edward Lazowska on "An Endless Frontier Postponed" (pdf), which runs in this week's issue of Science. Both pieces are well-timed given tomorrow's hearing of the House Science Committee on "The Future of Computer Science Research in the U.S.," which you can watch via the committee's real-time webcast. From Patterson's piece:

If declining U.S. research funding simply slowed the pace of IT innovation, perhaps the upcoming House Science Committee hearing wouldn't be as critical to the nation as it is to the research community. However, the rest of the world isn't standing still.

Chinese Premier Wen Jiabao recently went to India to propose co-development of the next generation of IT, with China producing hardware and India developing software. He predicted the coming of the Asian century of the IT industry, as both countries strive for worldwide leadership in IT.
The history of IT is littered with companies that lost substantial leads in this fast-changing field. I see no reason why it couldn't happen to countries. Indeed, at the recent International Collegiate Programming Contest of the Association for Computing Machinery, four Asian teams finished in the top dozen, including the champion, while the best U.S. finish was 17th, the country's worst showing ever. If current U.S. government policies continue, IT leadership could easily be surrendered to Asia.
Allow me to suggest two questions for the hearing: Could loss of IT leadership--meaning, for example, that the IT available to the U.S. might be inferior to that of China or India--lead to a technological surprise akin to the one with Sputnik 50 years ago? And, if the U.S. must face serious competition for leadership, isn't it better to attract the best and brightest to U.S. universities to come and work to help grow the American economy, rather than have them innovate elsewhere?
We'll have lots more on the hearing later today and tomorrow....
Apparently inspired by this week's Science editorial by Ed Lazowska and Dave Patterson (covered here), the Los Angeles Times today editorializes on DARPA and university IT research.
Since 1961, the Defense Advanced Research Projects Agency, or DARPA, has distributed IT research dollars in largely open-ended grants to universities. The grants encouraged basic research aimed not at marketable innovations but at basic scientific mysteries. DARPA and its investments have paid off handsomely nevertheless. Its legendary role in developing the Internet as a free-for-all instead of a commercially owned space is widely known. Less so are its militarily and commercially important developments, such as global positioning satellites, the JPEG file format for efficiently storing photographs and Web-searching technologies like those later refined by Google.

Since the terrorist attacks of 2001, however, Homeland Security officials have pushed DARPA to rein in its democratic funding systems. Grants once available to universities can now flow only to military contractors, and graduate student support once open to the most excellent thinkers can be offered only to U.S. citizens. Administration officials say the changes are needed to keep technological innovations out of the hands of potential terrorists. The effect may be instead to dampen imagination itself.

Here's the whole thing.
The collection of articles and editorials addressing this issue since the story first ran in the New York Times back on April 1, 2005 (covered previously) is almost too long to list. But I've done my best here.
Edward Lazowska and David Patterson (both former CRA board members and current members of the President's Information Technology Advisory Committee) have penned an excellent OpEd (sub. req'd) in this week's issue of Science magazine on the impact of the changing federal landscape for support of computing research. The OpEd makes a case that will be familiar to readers of this blog: the unique environment responsible for the IT innovations that drive much of the new economy is at risk by recent shifts within the federal IT R&D portfolio.
U.S. IT research grew largely under DARPA and the National Science Foundation (NSF). NSF relied on peer review, whereas DARPA bet on vision and reputation, complementary approaches that served the nation well. Over the past 4 decades, the resulting research has laid the foundation for the modern microprocessor, the Internet, the graphical user interface, and single-user workstations. It has also launched new fields such as computational science. Virtually every aspect of IT that we rely on today bears the stamp of federally sponsored research. A 2003 National Academies study provided 19 examples where such work ultimately led to billion-dollar industries, an economic benefit that reaffirms science advisor Vannevar Bush's 1945 vision in Science: The Endless Frontier.

The OpEd's conclusion is stark:

However, in the past 3 years, DARPA funding for IT research at universities has dropped by nearly half. Policy changes at the agency, including increased classification of research programs, increased restrictions on the participation of noncitizens, and "go/no-go" reviews applied to research at 12- to 18-month intervals, discourage participation by university researchers and signal a shift from pushing the leading edge to "bridging the gap" between fundamental research and deployable technologies. In essence, NSF is now relied on to support the long-term research needed to advance the IT field.
Other agencies have not stepped in. The Defense Science Board noted in a recent look at microchip research at the Department of Defense (DOD): "[DARPA's] withdrawal has created a vacuum . . . The problem, for DOD, the IT industry, and the nation as a whole, is that no effective leadership structure has been substituted." The Department of Homeland Security, according to a recent report from the President's Information Technology Advisory Committee, spends less than 2% of its Science and Technology budget on cybersecurity, and only a small fraction of that on research. NASA is downsizing computational science, and IT research budgets at the Department of Energy and the National Institutes of Health are slated for cuts in the president's fiscal year 2006 budget.
At a time when global competitors are gaining the capacity and commitment to challenge U.S. high-tech leadership, this changed landscape threatens to derail the extraordinarily productive interplay of academia, government, and industry in IT. Given the importance of IT in enabling the new economy and in opening new areas of scientific discovery, we simply cannot afford to cede leadership. Where will the next generation of groundbreaking innovations in IT arise? Where will the Turing Awardees 30 years hence reside? Given current trends, the answers to both questions will likely be, "not in the United States."

As I mentioned previously, the piece contains a link to a page here at CRA HQ that's sort of a one-stop shop for information relating to IT R&D policy. Ed has also placed a link to a pdf version of the article on his website.
The OpEd appears in an issue of Science devoted to distributed computing issues, with articles on Grassroots Supercomputing, Grid Sport: Competitive Crunching, Data-Bots Charting the Internet, Service-Oriented Science, and more. The timing of the issue also couldn't be better, given that the House Science Committee will hold a full committee hearing on "The Future of Computer Science Research in the U.S." on Thursday, May 12th. You can catch the details here, or watch it live on the Science Committee's real-time webcast (also archived).
And keep an eye out for future editorials....
Roll Call's Morton Kondracke writes in an OpEd (sub. req'd) that Congress must act to increase federal support for fundamental research or risk future competitiveness. The good news, he notes, is that Rep. Frank Wolf (R-VA), Chair of the House Appropriations Subcommittee on Science, State, Justice and Commerce, appears to be up to the challenge.
Wolf, who has led Congressional campaigns against gambling and has focused national attention on religious persecution and other human rights violations around the world, is now putting together an agenda to reverse America's decline in science.

Much of the credit for influencing Wolf's position has to go to the Task Force on the Future of American Innovation (of which CRA is a member). Their Benchmarks of Our Innovation Future (pdf) report seems to be resonating well with congressional offices, and special efforts to reach out to Wolf (who has been very receptive) seem to be paying off.

On April 12, he and two House colleagues - accompanied by former Speaker Newt Gingrich (R-Ga.) - announced the introduction of legislation to have the U.S. government pay the interest on undergraduate loans for students who agree to work in science, math or engineering for a five-year period.
Wolf also favors holding a blue-ribbon national conference on technology, trade and manufacturing where leaders of industry would highlight the danger to U.S. leadership. He wants to triple funding for federal basic-science programs over a period of years.
...
Wolf told me in an interview, rather diplomatically, that "I personally believe that [the Bush administration is] underfunding science. Not purposefully. I think we have a deficit problem, and previous administrations have underfunded it also."

Gingrich is less diplomatic. "I am totally puzzled by what they've done with the basic-research budget," he told me. "As a national security conservative and as a world trade-economic competition conservative, I cannot imagine how they could have come up with this budget."
He continued: "There's no point in arguing with them internally. They're going to do what they are going to do. But I think if this Congress does not substantially raise the research budget, we are unilaterally disarming from the standpoint of international competition."
Now the trick is to turn that enthusiasm into real appropriations -- something that remains a real challenge in current budget environment. We'll keep you posted.
This OpEd (free link) by Norman Ornstein, a fellow at the American Enterprise Institute (a reasonably influential conservative think tank -- Newt Gingrich is also a fellow), ran today in Roll Call (sub req'd), "the Newspaper of Capitol Hill." It's a strong defense of federal support of basic research that cites DARPA's declining support for university computer science research as one of the flawed policy decisions that need correcting to preserve our future competitiveness. Here's a snippet:
But I am growing increasingly alarmed, less because of the dynamism in Asia and more because of our blindness and obtuseness when it comes to our crown jewel: our overwhelming lead in basic research and our position as home to the best scientists in the world.

Basic research is the real building block of economic growth, and here we have had the franchise; just look at the number of Nobel Prize winners from the United States compared to the rest of the world combined. Our academic institutions and research labs have been magnets attracting, and often keeping, the best and the brightest. Our academic openness and our culture of freedom have encouraged good research and challenges to orthodoxy. Our politicians have recognized that most basic research has to be funded by the government because there is scant short-term economic benefit for most businesses to do it themselves.

But now, in a variety of ways, we are frittering away this asset, and for no good reason. Start with the federal budget. Basic research has been concentrated in a few key institutions: the National Institutes of Health, the National Science Foundation, the National Institute of Standards and Technology, and the Defense Advanced Research Projects Agency at the Pentagon. After a series of pledges to double the NIH budget and then keep it on a growth path, NIH has stagnated. Budget growth for next year is one-half of 1 percent, which will be below inflation for the first time since the 1980s, at a time when the need for more biomedical research is obvious.

The NSF budget is slated to grow by 2 percent, leaving it $3 billion below the funding level Congress promised in 2002. At NIST, the Bush administration is trying to eliminate the Advanced Technology Program and to slash the Manufacturing Extension Partnership by 57 percent. At DARPA, which originated the Internet but where computer science research has been flat for several years, the money going to university researchers has fallen precipitously, along with a larger focus on applied research for the here and now.

...

It is gut check time. The foolish fiscal policies that keep big entitlements off the table, won't consider revenues along with spending, and have turned the one-sixth of the budget that is discretionary into a vicious, zero-sum game, are truly eating our seed corn in this critical area. Somebody needs to get the White House to wake up, and Congress to understand what it is mindlessly doing.

And with that, the (bipartisan) chorus of voices grows....
The Seattle PI makes the economic case for federal support of R&D in an editorial today.
But what happens if the United States not only gives up every trade protection benefit, continues to suffer a loss of manufacturing and fritters away its research leadership in science, medicine and technology?

That's a lose-lose proposition. And it ought to worry U.S. leaders a lot more than it has so far.

Read it all.
Following in the wake of news stories and OpEds in the New York Times, the San Jose Mercury News editorializes today on the negative impact of DARPA's shift away from university researchers in computer science and engineering.
Of all the government sources of funding for basic technology research, few have delivered more breakthroughs for Silicon Valley and the U.S. economy than the Pentagon's Defense Advanced Research Projects Agency, or DARPA.

That's why a shift away from basic and university research in DARPA funding is alarming for the valley and for the future of innovation in the United States. Long-term casualties could eventually include America's competitiveness and military readiness.

...

The shift at DARPA is all the more troubling as it goes hand in hand with decreases in funding for basic research across the Pentagon and at the National Science Foundation. What's more, these subtle yet significant changes have occurred without a national debate.

The time to have that debate is now. If these trends continue, America will pay dearly for them.

Fortunately, it appears that Congress is getting interested in having that debate. In early May the House Science Committee will hold a hearing on the issue. Testifying before the committee will be John Marburger, Director of the White House Office of Science and Technology Policy; Tony Tether, Director of DARPA; Bill Wulf, President of the National Academy of Engineering; and Tom Leighton, Co-Founder and Chief Scientist at Akamai Technologies and Chair of the PITAC Subcommittee on Cyber Security, which just released its review of the federal government's cyber security R&D programs. We, of course, will bring you all the details.

In the meantime, read the full editorial.
Since Sue, Ed, Andy, and a whole host of my relatives have all sent me a pointer to this OpEd by Thomas Friedman in the NY Times, you may have already seen it. But that doesn't make it any less worth noting.
Friedman picks up where former Clinton defense officials Perry and Deutch left off earlier in the week (which we covered here); they, in turn, picked up where NY Times reporter John Markoff left off a couple of weeks earlier (which we covered here). Friedman argues that the Bush Administration, by cutting the U.S. investment in fundamental research, has put not only our national security at risk (as noted by Perry and Deutch), but our economic security at risk as well.
The Bush team is proposing cutting the Pentagon's budget for basic science and technology research by 20 percent next year - after President Bush and the Republican Congress already slashed the 2005 budget of the National Science Foundation by $100 million.

Of course, when Friedman writes regarding the National Innovation Initiative:

When the National Innovation Initiative, a bipartisan study by the country's leading technologists and industrialists about how to re-energize U.S. competitiveness, was unveiled last December, it was virtually ignored by the White House. Did you hear about it? Probably not, because the president preferred to focus all attention on privatizing Social Security.
It's as if we have an industrial-age presidency, catering to a pre-industrial ideological base, in a post-industrial era.
Did you hear about it? Probably not...he's obviously not referring to readers of this blog, who read all about the Council on Competitiveness report back on December 15th. :)
Friedman has hit the Administration and Congress hard (and repeatedly) for allowing NSF to be cut in the FY 2005 appropriations, so I'm glad to see him continue to bang the drum for federal support for fundamental research.
So, read the whole thing, and thanks to Sue, Ed, Andy and my relations for pointing it out.
The New York Times has an interesting OpEd today from former secretary of defense William Perry and his former undersecretary John Deutch on the lack of support for basic research, applied research and advanced technology development (collectively, "Defense Science and Technology") at the Department of Defense.
Of the Pentagon's $419.3 billion budget request for next year, only about $10.5 billion - 2 percent - will go toward basic research, applied research and advanced technology development. This represents a 20 percent reduction from last year, a drastic cutback that threatens the long-term security of the nation. Secretary of Defense Donald Rumsfeld should reconsider this request, and if he does not, Congress should restore the cut.

While it's not earth-shattering that members of the previous administration might question the priorities of the current administration, the OpEd adds to the chorus of voices expressing concern about DOD R&D trends.

These research and development activities, known as the "technology base" program, are a vital part of the United States defense program. For good reason: the tech base is America's investment in the future. Over the years, tech base activities have yielded advances in scientific and engineering knowledge that have given United States forces the technological superiority that is responsible in large measure for their current dominance in conventional military power.
Worth reading the whole thing.
And watch this space for news of yet another influential voice raising concerns....
Washington Post science and technology writer Rick Weiss riffs off of the recent news that NASA plans to pull the plug on the Voyager missions to demonstrate that the U.S. support for research has become too mundane -- too evolutionary rather than revolutionary, too focused on short-term gains versus long-term results. The two Voyager probes, three decades after being launched on their tour of the outer planets, are now tickling the edge of interstellar space and still sending back data. NASA's FY 2006 budget request eliminates funding for the Voyager program and a suite of other space probes (total cost savings = $23 million in FY 06) as part of the agency's effort to refocus on the President's Moon/Mars initiative -- an initiative that has led to significant cuts elsewhere in the agency as well. Unfortunately, the problems aren't just limited to NASA:
It would be less disheartening if the move to kill the Voyager program were an isolated example. But the U.S. scientific enterprise is riddled with evidence that Americans have lost sight of the value of non-applied, curiosity-driven research -- the open-ended sort of exploration that doesn't know exactly where it's going but so often leads to big payoffs. In discipline after discipline, the demand for specific products, profits or outcomes -- "deliverables," in the parlance of government -- has become the dominant force driving research agendas. Instead of being exploratory and expansive, science -- especially in the wake of 9/11 -- seems increasingly delimited and defensive.

We've covered the DARPA story and its impact on computer science research pretty extensively (latest here).

Take, for example, the Pentagon's Defense Advanced Research Projects Agency -- arguably the nation's premier funder of unencumbered scientific exploration, whose early dabbling in computer network design gave rise to the Internet. Agency officials recently acknowledged to Congress that they were shifting their focus away from blue-sky research and toward goal-oriented and increasingly classified endeavors.
Similarly, in geology, scientists have for years sought funds to blanket the nation with thousands of sensors to create an enormous, networked listening device that might teach us something about how the earth is shifting beneath our feet. The system got so far as to be authorized by Congress for $170 million over five years, but only $16 million has been appropriated in the first three of those years and just 62 of an anticipated 7,000 sensors have been deployed. Only in fiscal 2006, thanks to the South Asian tsunami, is the program poised to get more fully funded -- out of a narrow desire to better predict the effects of such disasters here.
The Department of Energy in February announced it is killing the so-called BTeV project at Fermilab in Batavia, Ill., one of the last labs in this country still supporting studies in high-energy physics. This field, once dominated by the United States, promises to discover in the next decade some of the most basic subatomic particles in the universe, including the first so-called supersymmetric particle -- a kind of stuff that seems to account for the vast majority of matter in the universe but which scientists have so far been unable to put their fingers on.
"We seem to have reached a point where people are so overwhelmed by the problems we face, we're not sure we really need more frontiers," said Kei Koizumi of the American Association for the Advancement of Science, noting that the only segments of the nation's research and development budget enjoying real growth are defense and homeland security.
Anyway, it's a good piece -- it even starts with a Star Trek quote. Read it all here.
John Markoff writes in detail in Saturday's NY Times about DARPA's diminishing investment in university-based computer science research and its potential impact.
The Defense Advanced Research Projects Agency at the Pentagon - which has long underwritten open-ended "blue sky" research by the nation's best computer scientists - is sharply cutting such spending at universities, researchers say, in favor of financing more classified work and narrowly defined projects that promise a more immediate payoff.

Markoff's piece is largely based on answers the agency provided the Senate Armed Services Committee in response to the committee's questions about DARPA's historical support of IT R&D and the role of universities. In their response, DARPA noted that their overall support for computer science activities has averaged $578 million a year (inflation adjusted) for the last 13 years and that university participation in that research over the last 4 years has plummeted. (Due to "data constraints" they don't have figures prior to FY 01.) In FY 01, DARPA funded $546 million in IT research overall, $214 million in universities. By FY 2004, the overall funding had risen to $583 million, and the university share had dropped to $123 million.

Hundreds of research projects supported by the agency, known as Darpa, have paid off handsomely in recent decades, leading not only to new weapons, but to commercial technologies from the personal computer to the Internet. The agency has devoted hundreds of millions of dollars to basic software research, too, including work that led to such recent advances as the Web search technologies that Google and others have introduced.
The shift away from basic research is alarming many leading computer scientists and electrical engineers, who warn that there will be long-term consequences for the nation's economy. They are accusing the Pentagon of reining in an agency that has played a crucial role in fostering America's lead in computer and communications technologies.
"I'm worried and depressed," said David Patterson, a computer scientist at the University of California, Berkeley, who is president of the Association for Computing Machinery, an industry and academic trade group. "I think there will be great technologies that won't be there down the road when we need them."
DARPA cited five "factors for the decline":
1. A change in emphasis in the high performance computing program from pure research to supercomputer construction;
2. Significant drop in unclassified information security research;
3. End of TIA-related programs in FY 2004 due to congressional decree, a move that cost universities "a consistent $11-12 million per year" in research funding;
4. Research into intelligent software had matured beyond the research stage into integration;
5. Classified funding for computer science-related programs increased markedly between FY 2001 and FY 2004, but universities received none of this funding.
Essentially, they conceded that their focus in IT R&D is increasingly short-term (at least in the unclassified realm) and that universities are no longer significant performers of DARPA IT R&D (classified or unclassified). Not surprisingly, these are the two major concerns CRA has repeatedly cited about the agency.
Anyway, the article is a must read.
Update: (4/3/2005) - Noah Shachtman at Defense Tech has a bit more:
Federal Computer Week has a depressing article today on the impact of recent and planned cuts to NASA's IT programs. The agency's IT R&D programs are due to decline by $66 million in FY 2005, with a further cut of $89 million requested in the President's FY 2006 budget -- a figure that would represent a total cut of 60 percent since FY 2004. The Administration says that NASA's investments in IT R&D in FY 2006 will be reduced across the board, largely due to redirected funding to the President's Moon/Mars initiative and the Space Shuttle Return to Flight program -- the same reason given for the FY 2005 cuts that are putting pressure on agency supercomputing efforts now.
FCW says the cuts in FY 05 will result in 15 to 20 layoffs of NASA Ames' supercomputing staff and 20 to 25 layoffs in its robotics staff (currently at 70 and 100, respectively). Buyout packages are being offered.
Chris Knight, vice president for negotiations at Ames Federal Employees Union and a Computational Sciences Division employee, said the buyouts apply to all IT workers except three in visualization and robotics. But the amounts will not be enough to convince most people to leave, he said.

"A lot of the research centers are being basically bled dry," Knight said.

Read the whole article.
BBC coverage of the jump to 135.5 teraflops.
ZDNet has more.
The feat won't show up on the current Top500.org list until they release the next revision of the list, which I think will be in May (the last was released in November at the Supercomputing 2004 conference in Pittsburgh, and it seems to be issued at six month intervals).
Update: John West, Director of the ERDC MSRC -- one of four DOD HPC program centers -- e-mails with a helpful clarification:
Top500 lists are published twice a year: in June and in November. The November list is announced at the annual Supercomputing series of conferences (www.supercomp.org), which is probably part of the reason for its not-quite-six-months timing.

He also notes that the LINPACK score (upon which the Top500 list is based) isn't the best way to assess a supercomputer's relative benefit to a discipline, despite its popularity -- something I probably should have noted in my post.
In my defense, as limited as the LINPACK score is in what it says about a particular machine, it is the one number most people out here (certainly in the policy world) cling to when trying to understand progress in supercomputing. Though it wasn't the message we sought to convey, the fact that the Japanese Earth Simulator was X teraflops faster than our "best" machine certainly focused the mind of a lot of policymakers in Congress last year, for better or worse. In talking about high-end computing with them, we certainly tried not to emphasize that measure; rather, we tried to talk about the importance of a sustained research effort on a diverse set of approaches to enable progress on a wide range of different problems.
John also notes that there are some interesting efforts to develop a new metric coming out of DARPA's HPCS program, but those measures are likely to be a bit more complex -- almost certainly spelling doom for their adoption over the "one number fits all" of the LINPACK.
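For readers less familiar with the terminology in this thread: a machine's Top500 rank is determined by Rmax, its achieved rate on the LINPACK dense linear-algebra benchmark, which can be compared against Rpeak, its theoretical peak rate, to see how much of the hardware's potential the benchmark actually exercises. A minimal sketch in Python -- the machine names and all figures below are hypothetical, purely to illustrate how the single-number ranking works:

```python
# Illustrative only: rank hypothetical machines the way the Top500 list
# does -- by Rmax, the achieved LINPACK rate. Efficiency (Rmax / Rpeak)
# shows the fraction of theoretical peak delivered on LINPACK.
# All names and numbers below are made up for illustration.

machines = [
    # (name, rmax_tflops, rpeak_tflops) -- hypothetical values
    ("System A", 135.5, 183.5),
    ("System B", 35.9, 41.0),
    ("System C", 70.7, 91.8),
]

# Sort by achieved LINPACK rate, descending, as the Top500 list does
ranked = sorted(machines, key=lambda m: m[1], reverse=True)

for rank, (name, rmax, rpeak) in enumerate(ranked, start=1):
    efficiency = rmax / rpeak
    print(f"#{rank} {name}: Rmax={rmax} TFlop/s ({efficiency:.0%} of peak)")
```

As John's point suggests, two machines with similar Rmax can behave very differently on real applications, which is part of why the single LINPACK number is so seductive to policymakers and so limited as a measure of scientific benefit.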
The long-awaited PITAC report on Cyber Security, Cyber Security: A Crisis of Prioritization (pdf, 2.2mb), has just been released. The committee spent nearly a year reviewing the federal government's cyber security R&D effort, a process we've covered in this space. The resulting report concludes that the IT infrastructure -- beyond the public Internet -- is a crucial piece of the nation's critical infrastructures, such as power grids, air traffic control systems, financial systems, and military and intelligence systems. Given its importance, the committee finds that the federal cyber security R&D investment is inadequate and "imbalanced" towards short-term, defense-oriented research, with little support for fundamental research to address the larger vulnerabilities of the civilian IT infrastructure. As a result, the committee recommends changes to the portfolio to:
- Increase Federal support for fundamental research in civilian cyber security by $90 million annually at NSF and by substantial amounts at agencies such as DARPA and DHS to support work in 10 high-priority areas identified by PITAC.
- Intensify Federal efforts to promote recruitment and retention of cyber security researchers and students at research universities, with an aim of doubling this profession's numbers by the end of the decade.
- Provide increased support for the rapid transfer of Federally developed cutting-edge cyber security technologies to the private sector.
- Strengthen the coordination of the Interagency Working Group on Critical Information Infrastructure Protection and integrate it under the Networking and Information Technology Research and Development (NITRD) Program.

I'll have more detail on the report as I work my way through it, but wanted to get a link up to it ASAP. At 72 pages cover-to-cover, the report is a very revealing examination of the federal cyber security R&D portfolio.
Update: (3/19/05) - The NY Times' John Markoff has more on the report today, including this quote from PITAC co-Chair Ed Lazowska:
"The federal government is largely failing in its responsibility to protect the nation from cyberthreats," said Edward D. Lazowska, chairman of the computer science and engineering department at the University of Washington and co-chairman of the panel. "The Department of Homeland Security simply doesn't 'get' cybersecurity. They are allocating less than 2 percent of their science and technology budget to cybersecurity, and only a small proportion of this is forward-looking."

...Michelle Petrovich, a spokeswoman for the Department of Homeland Security, disputed the criticism. "We take cybersecurity seriously and have taken aggressive measures to address various needs," she said. "Our cybersecurity budget has gone up every year."

For the record, it may be true that DHS' overall budget for "cyber security" activities has gone up, but cyber security R&D -- the focus of this report and, one would think, a focus of the DHS Science and Technology directorate -- has actually been flat at DHS for the last two fiscal years at a paltry $18 million out of an overall S&T budget of just about $1 billion per year. And of that tiny share, only $1.5 million could truly be called "long-term" research -- research beyond patching the holes in the current systems. As the report points out, without research into fundamentally new approaches, we'll be "endlessly patching and plugging holes in the dike" for years to come. It's also worth noting that the President's budget for cyber security research at DHS this year actually takes a step backwards. For FY 2006, the President's budget would cut cyber security R&D at the agency to $17 million, a decrease of $1 million from FY 2005.