

Sigma Xi Distinguished Lecturers 2007-2008 Abstracts

John B. Carberry

Sustainable Industry in a Changing Society (P, G)
Changing Expectations of Society and Government: Over the past 40 years, the expectations of the public, regulators, and the government about the impact of industry on the environment have changed from almost laissez-faire to a strong and dynamic demand for sustainability. This trend was first manifest in a large array of laws and regulations that can be particularly expensive unless addressed proactively and, during the last decade, has been increasingly reflected in the marketplace. This presentation reviews the major environmental issues and trends over that period, methods to anticipate or otherwise prepare for these rising expectations, and examples of creative responses to these issues. The concepts of green manufacturing and sustainable businesses will be outlined in the context of the business case for meeting those expectations.

Sustainability in the Chemical Industry (S)
Markets, Requirements and the Bottom Line: Based on an analysis of almost two decades of environmental programs in chemical industry manufacturing, engineering-oriented footprint and operating-cost issues will be identified, along with examples of ways to address them. In general, the focus will fall on either programs that must be undertaken for defensive reasons or opportunities to provide a better product for customers. In both situations, the environmental concern and the business requirement will be reviewed. In addition, changes in expectations among customers, regulators, and the local community will be reviewed, and their effects on future products will be projected. Finally, the emerging concepts and engineering needs for truly sustainable facilities and products will be discussed.

Helen M. Berman

How the History of the Protein Data Bank Informs the Future of Biology (S)
Since its grassroots beginnings in 1971, the Protein Data Bank (PDB) has been the archive for the three-dimensional coordinates of experimentally-determined biological structures. Today, it is a resource used by researchers and students studying the structures of biological macromolecules and their relationships to sequence, function, and disease.

This presentation will explore how the PDB developed from a simple repository for data from X-ray crystallographic experiments to a "knowledgebase" for structural biology.

Creating a Data Resource for Biology: Lessons from the Protein Data Bank (P,G)
There are many considerations when building a community resource for enabling science. One is the necessity of a scalable infrastructure that can handle vast amounts and different types of data. This infrastructure must also be able to adapt to new and changing technologies. Another concern is how to solicit and incorporate the needs and wants of a variety of user communities. How are policies created and enforced? A case history of a global resource for science, the Protein Data Bank (PDB), will be presented.

The PDB has been the archive for the three-dimensional coordinates of experimentally-determined biological structures. Today, it is a resource used by researchers and students studying the structures of biological macromolecules and their relationships to sequence, function, and disease.

Daryl E. Chubin

Supply, Demand, and Something Else (G,S)
Historically, the production of scientific knowledge has subsumed the production of new practitioners, treating human resources as a byproduct of what really matters in science and engineering (S&E): scholarship measurable in articles, citations, patents, and an array of prizes. R&D funding policy will not solve education and workforce dilemmas. Because the US has never had a human resource development policy, federal policies that address human resources have been fragmented categorically: by group, segment of the education system, sector of the economy, institutional type, etc. The national interest gets lost. Neither OSTP nor the cognizant committees of Congress emerge with a systemic view of needs, trends, and options in building and sustaining the S&T workforce.

Globalization only complicates matters. National security clashes with visa policy. For US citizens, the policy focus ranges from women faculty to holistic college admissions to student retention to the seemingly inexorable growth in the number of postdocs. Clearly, these issues are at the vortex of demography, economy, education, and employment. Hanging in the balance are careers, institutional vitality, and the composition, not just the size, of the future S&T workforce. To meet America's need for world-class talent in science and engineering, higher education must develop an emerging U.S. talent pool that looks very different from decades past. As diversity by gender, race, ethnicity, disability, citizenship, region, discipline, etc. has grown, our policy responses strike many as antagonistic, divisive, bureaucratic, and/or unworkable. This lecture suggests how we can do better.

Tipping Points in Academe (P)
Malcolm Gladwell's best-seller, The Tipping Point, distills much research wisdom about when communities change and why. Marketed as a business strategies book, its social and behavioral science examples speak persuasively to what is not only possible, but predictable, in the behavior of those formerly complacent or distracted by life's routines. Within those routines dwell traditions that grip us all. The problems appear intractable. But they are not. The work of the AAAS Capacity Center (www.aaascapacity.org) with client institutions of higher education demonstrates that, with leadership, departments and colleges on campus can introduce practices that support the success of students and faculty. Change is possible if context is heeded. Indeed, innovation can be spread to other parts of the institution. Examples (with institutional anonymity assured) from the Center's portfolio will be discussed.

Measuring Program Impact: How to Evaluate the Value-Added (G,S)
As interventions in student learning proliferate, the call for accountability (what difference do they make?) continues to grow. Based on reviews of undergraduate and graduate programs developed in an array of institutional settings and distilled in the 2004 BEST report, A Bridge for All (www.bestworkforce.org), this seminar will offer templates for faculty and administrators seeking to measure processes and outcomes of programs as educational experiments. The presenter's current work with college- and university-based projects funded by federal and corporate sponsors provides a corpus of questions to ask and approaches to consider.

Science in a Bigger Frame (P)
Whether it is education, trade, or public policy, science these days seems to dominate the social landscape. This is new territory for a science community that is used to speaking to itself and its sponsors. Now there are other consumers: nonscientists eager to use and abuse science. It is as much a target as a resource. As a result, public policy becomes a focus of conflict through the political process.

An earlier era spoke of science literacy and public understanding. Indifference was lamentable, but appreciation of science was within reach. Today, we have no such illusions. Stem cell research, the teaching of evolution, and nanotechnology are political issues as well as scientific problems. Experts will disagree. Customs used within science to adjudicate differences in judgment, e.g., peer review, remain as mysterious as the technical disagreements themselves. Who mediates and translates? How do we, experts and citizens alike, converge on meanings and courses of action? This lecture addresses science as one cultural pursuit with consequences well beyond the science community.

Nicholas K. Coch

Are America's Beaches All Washed Up? (P)
Extensive coastal development, a lack of hurricane danger perception, modification of shorelines with engineering structures, and a rising sea level have increased the danger to coastal inhabitants and their structures. This talk describes how coastal systems work and evolve naturally, as well as how anthropogenic and natural changes are causing problems on our coasts. Major problems will occur as we continue to build fixed structures on a moving shoreline. What are our options?

Hurricane Hazards in the U.S. (P)
Hurricanes pose a major problem for the Gulf and Atlantic coastal regions because recent research suggests that in the coming decades hurricane frequency, and possibly intensity, will increase. Our coastal areas are more vulnerable to damage because of overbuilding, alterations from engineering structures, and rising sea level. This talk describes the basic mechanics of hurricane damage and how they operate on different types of shorelines. No major hurricane has hit a major urban coastal population in the last century; we are statistically overdue. It is vital that we develop an effective hurricane management program before the "Big One" hits the United States shoreline.

Hurricane Damage along the (New England, Mid-Atlantic, South Atlantic, Gulf) Coast. (G)
The New England, Mid-Atlantic, South Atlantic, and Gulf Coasts each have a different hurricane landfall frequency and potential for destruction. Any one of these regions may be chosen for discussion. The lecture will review past history, present state, specific vulnerabilities, and susceptibility to future damage.

Unique Vulnerability of New York City to Hurricane Destruction (G)
The unique demographic, oceanographic, topographic, bathymetric, and geographic conditions of the New York-New Jersey Metropolitan region greatly amplify the effects of landfalling hurricanes. Based on historical records, past Category 2 hurricanes have done Category 3 damage, and Category 3 hurricanes have caused damage equivalent to Category 4 hurricanes in the South. This talk deals with how this major urban coastal section will fare in the future. Hurricane landfalls in this area are infrequent, but their consequences can be catastrophic. What will happen when the "Big One" hits the "Big Apple?"

Forensic Hurricanology and the Reconstruction of Historic Hurricanes (S)
Forensic hurricanology utilizes information obtained from recent hurricanes to interpret damage patterns in historic hurricanes. The causal mechanisms for given damage patterns are obtained from modern quantitative data. Damage descriptions from a wide variety of historical sources provide accounts of damage patterns, which are interpreted in light of the recent data. Inferences are made about wind speed, radius of maximum winds, surge levels, and translational velocities from the historical data. Areal plotting of the data makes a reconstruction of the wind field possible. Joint analysis of the data with specialists from the National Hurricane Center makes it possible to produce dynamic computer models of 17th- through 19th-century hurricanes.

Dynamics of Hurricane Destruction by Wind, Waves, Surge and Inland Flooding - Facts and Fallacies (S)
Hurricanes cause damage by wave attack, surge flooding and wind at the shoreline and by wind and inland flooding away from the coast. Recent studies have shown that hurricanes are not just coastal events, but can spread damage hundreds of miles inland if they make a high-angle landfall with the coast. The lecture reviews all of these types of damage and provides new insights into each from recent hurricanes. Damage mitigation measures have had mixed success in past hurricanes. It is time for new ideas and a modern perspective before the inevitable major hurricane hits one of our large urban coastal areas.

Adela I. de la Torre

Sex on the Border: Risky Practices and HIV (G)
Economic literature on Mexican female sex workers (FSWs) asserts that financial need is the primary motivation for labor market entry. If the price differential between protected and unprotected sex is significantly higher, negotiating condom use may be difficult for FSWs. The purpose of this paper is to examine differential prices FSWs charge for sex with and without a condom in two Mexican border cities.

Data were drawn from a larger intervention study conducted among adult FSWs in Mexico. A total of 277 FSWs were interviewed in Tijuana and 345 in Ciudad Juarez. Price differentials were evaluated in both sites, revealing that significant differentials exist in both cities after controlling for market and non-market characteristics of sex workers such as sexual power and attractiveness. These results suggest that a higher price is paid for unprotected sex in both cities, which may act as a deterrent to safer sex practices. This price incentive effect may be significant when communities consider the long-term sustainability of prevention and intervention programs.

Improving Latino Health Outcomes: Identifying Best Outreach Practices for Public Health Insurance (P)
In a study examining the differential impact of Medicaid expansions on the health status of children by race and ethnicity, Lykens and Jargowsky (2002) pointed out that access to public health insurance programs depends on three distinct realms of action. To benefit from public health insurance programs, such as Medicaid and the State Children's Health Insurance Program (SCHIP), an individual must first qualify for, then enroll in, and ultimately take advantage of the care plan available (Lykens and Jargowsky 2002). Although this is a seemingly simple statement of fact, each of these three realms presents a unique set of difficulties for Latinos, especially in terms of their ability to benefit from Medicaid and SCHIP programs. This paper identifies the barriers to enrollment for Latinos in publicly subsidized programs such as SCHIP and reviews the limited body of literature on culturally innovative interventions, particularly those that have been shown to be effective in reducing rates of under-enrollment among Latinos eligible for public health insurance programs. Finally, based on focus groups in Chicago, Florida, and California, as well as a review of state reports identifying outreach practices targeted to Latinos, best practices will be identified to improve Latinos' access to public health insurance programs in an effort to improve their well-being and quality of life.

Immigration Policy and Immigration Flows: A Comparative Analysis of Immigration Law in the US and Latin America (P)
This talk presents a comparative analysis of US and Latin American immigration policies, with specific reference to immigration policy in Mexico and Argentina. Similar to the US, many Latin American countries are viewed as target sites for immigrants from Asia as well as from bordering countries with relatively lower wages and limited occupational opportunities. Nevertheless, these countries often provide more progressive strategies for economic and political incorporation of legal and illegal immigration within their borders. Although the absolute magnitude of migration is not equivalent to US legal and illegal migration in Latin American countries, the relative impact on specific sectors of these Latin American countries may be similar to that experienced in the US. Moreover, the Federal approach to immigration reform in Latin American countries is less influenced by populist rhetoric resulting in a less hostile and more practical approach to incorporating immigrants, both legal and illegal, into the social and economic fabric of these countries.

Kerry Emanuel
American Meteorological Society - Sigma Xi Lecturer

Divine Wind: The History and Science of Hurricanes (P,G)
Hurricanes have inspired literature and art through the ages and changed the course of history. In this lecture, I will discuss the science of hurricanes and their role in human history, ending with a discussion of the effect of climate change on hurricane activity.

Is Global Warming Affecting Hurricanes? (G,S)
Analysis of historical records of hurricane activity reveals large variability from one decade to the next. How much of this variability is random, how much can be said to be part of natural, regional or global climate fluctuations (such as El Nino), and how much is tied to man-made global climate change? These are important questions, as their answers bear on the pressing question of how hurricane activity might change over the next century. I will review the evidence that hurricane activity is closely linked to sea surface temperature and then examine the various environmental processes that cause sea surface temperature to change, focusing on the role of human-induced climate change.

Hurricane Physics (S)
Hurricanes are nearly perfect examples of heat engines, driven by an evaporative enthalpy flux from the ocean to the atmosphere, and operating over a temperature differential of more than 100 K between the sea surface and the storm top. I will demonstrate that the thermodynamic cycle of a mature hurricane is very close to that of Carnot's maximally efficient cycle, and go on to talk about the physics of the genesis and intensification of hurricanes, focusing on remaining problems.
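The heat-engine framing above can be made concrete with the Carnot efficiency formula. This is a minimal sketch: the specific temperatures (sea surface ~300 K, storm top ~200 K) are illustrative assumptions consistent with the "more than 100 K" differential mentioned in the abstract, not figures from the lecture itself.

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of input heat a Carnot cycle can convert to work."""
    if not 0.0 < t_cold_k < t_hot_k:
        raise ValueError("require 0 < t_cold_k < t_hot_k (kelvin)")
    return 1.0 - t_cold_k / t_hot_k

# Illustrative values (assumptions): tropical sea surface ~300 K,
# hurricane outflow near the storm top ~200 K.
sea_surface = 300.0
storm_top = 200.0
print(f"Carnot efficiency: {carnot_efficiency(sea_surface, storm_top):.2f}")  # 0.33
```

A 100 K differential at these temperatures thus allows roughly a third of the enthalpy drawn from the ocean to be converted to the mechanical energy of the winds, which is why warm oceans under a cold tropopause can sustain such intense storms.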

The Hurricane Embryo (S)
Although the physics of mature hurricanes are fairly well understood, their genesis remains enigmatic. They can be shown to be finite amplitude instabilities, arising from a subcritical bifurcation of the radiative-convective equilibrium state of the Tropics. In this talk, I will suggest that the establishment of a mesoscale (~100 km in diameter) column of nearly saturated air extending through the troposphere is a necessary and perhaps sufficient condition for tropical cyclogenesis, and that such a development strongly suggests that generating systems must pass through a phase in which a cold-core cyclone exists at mid and upper levels of the troposphere.

Adam M. Finkel
Society for Risk Analysis Distinguished Lecturer

Both Sides Now: Misguided Attacks on Risk Assessment and Cost-Benefit Analysis (P,G)
Given something as common-sensical as "compare the costs and benefits of each proposed action to protect health or the environment," it may be surprising how angry this activity makes people - and how it breeds critics on both extremes of the ideological spectrum. As a developer and user of risk analysis methods, I have been branded as contributing to a "vicious circle" of exaggerated risk fueling unwarranted public fears - although I have shown in my research that these methods often actually underestimate risk, while also overestimating the costs of regulatory controls. At the same time, many environmentalists attack risk assessment for being insufficiently protective, inevitably immoral, and contributing to "paralysis by analysis," criticisms which I also view as misplaced. This talk will explore the political landscape of risk assessment, show how indispensable it has been to national and international policies addressing health, safety, and environmental issues, and argue that we should neither abandon it nor allow critics to hijack it in order to bludgeon the public into "declaring victory" over unsolved environmental problems. I will summarize scientific and economic research to shed light on the crucial question of whether current methods of analysis exaggerate or underestimate risks and costs. A key aspect of this talk will be to shine a spotlight on the controversial notion of "precaution," to show how it can be used in opposite ways by different decision-makers, and how it can instead be made an explicit and useful part of individual and social choice.

130 Million Neglected: the Fall of Worker Safety and Health in the U.S. (P,G,S)
More than 130 million Americans spend roughly half their waking hours at work, and as a consequence face risks of accidental death and chronic disease that can dwarf similar risks encountered anywhere else in life. The American workplace is unquestionably safer now than at any time in the past and, in some cases, features some of the safest and "cleanest" work environments in the industrialized world. However, the number of fatal accidents at work recently began to increase, after decades of steady decreases down to a level (roughly 5,000 deaths per year) that many still regard as unacceptably high. Scientists estimate that perhaps ten times as many U.S. workers die prematurely each year as a result of exposures to hazardous substances, and we lack the information to discern whether this number is rising or falling. For 35 years, the U.S. Occupational Safety and Health Administration (OSHA) has been charged with setting and enforcing standards to improve workplace conditions, with (as one point of reference) roughly 5% of the budget and staff of the Environmental Protection Agency. Have these resources been adequate for the task, and has the agency used them wisely, in the face of some hazards known to observers for millennia and others that have only arisen with cutting-edge changes in products and processes? Having just completed 10 years as the chief regulatory official at OSHA and the chief enforcement official in the Rocky Mountain states, I have an insider's perspective on what OSHA has achieved, and where it has let the country down. The talk will emphasize practical recommendations for increasing the cost-effectiveness of our national worker-protection system, and for fulfilling the promise of the federal agency at its center.

Modernizing Quantitative Risk Analysis in Light of Human Inter-individual Variability (G,S)
We often estimate and communicate risks to health by means of the "body count," as in "42,000 people were killed in automobile crashes last year." Most of the federal and state regulations governing environmental and occupational health risks seek to lower individual risks to an "acceptable" probability, such as one chance in one million. In either case, scientists and regulators assume that one number can describe the situation for every citizen. When we start from the "body count," we often implicitly or explicitly recast it as the individual risk to the average person (in the example above, 42,000 deaths in a population of 300 million Americans yields a risk of 1.4 per 10,000); when we start from a direct estimate of an individual risk, it sometimes is intended to represent an atypical person at high risk, and sometimes the average person. None of these situations accounts for the substantial variation in individual risk that real people face, as a consequence of the widely different circumstances of exposure and the vast differences among us in our genetic predispositions, overall health status, and other factors. This talk will explain how inadequate single estimates of risk can be, in two related arenas where they are used: risks to health and safety in communities and workplaces, and results of medical screening tests (or predictions of the risk of medical interventions). I will emphasize the recent explosion of knowledge about the human genome, but will also provide examples of ignoring human differences that are much more obvious, and will conclude with policy recommendations for improving risk assessment and risk management to enlighten and benefit individuals as well as populations.
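The arithmetic in the abstract, plus the point about averages masking variability, can be sketched briefly. The 42,000 deaths and 300 million population come from the abstract; the two-group split below is an invented, purely hypothetical illustration of how very different individual risks can hide behind the same average.

```python
def average_individual_risk(deaths: float, population: float) -> float:
    """Annual average individual risk implied by a population 'body count'."""
    return deaths / population

# Figures from the abstract: 42,000 crash deaths among 300 million Americans.
avg = average_individual_risk(42_000, 300_000_000)
print(f"average risk: {avg:.1e} per person per year")  # i.e., 1.4 per 10,000

# Hypothetical split (numbers invented for illustration): 10% of people at
# 5 per 10,000 and 90% at 1 per 10,000 produce the very same average,
# even though individual risks differ fivefold across the two groups.
mixed = 0.10 * 5e-4 + 0.90 * 1e-4
assert abs(mixed - avg) < 1e-9
```

The single number 1.4 per 10,000 is therefore compatible with radically different underlying distributions of risk, which is exactly why single estimates can mislead both regulators and individuals.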

The Odyssey of a "Vindicated Whistleblower" (P,G,S)
We have an elaborate system of laws and institutions that are supposed to promote three (possibly conflicting) goals: allow witnesses to fraud, malfeasance, and "deadly neglect" to shed light on such conditions without fear of reprisal; protect whistleblowers who are harassed or punished for their disclosures; and protect agencies and companies from frivolous or erroneous disclosures. I have seen this system from both sides, first as a Regional Administrator for OSHA (the U.S. Occupational Safety and Health Administration), the federal agency that investigates charges of retaliation against private-sector whistleblowers in workplace, environmental, financial-accounting, and other arenas, and most recently as a litigant against my own agency. In 2002, after protesting internally against my agency's decision not to provide inexpensive blood tests to our own employees who had been exposed to dangerous levels of beryllium dust in the course of conducting inspections of contaminated facilities, I discussed my concerns with a reporter. On the day the article about the decision appeared, I was stripped of my executive position and transferred across the country with my family. I eventually won a substantial settlement from OSHA, and the agency later revealed that the first round of medical tests had uncovered health problems in our workforce at twice the prevalence I had predicted. This talk will summarize the current information about chronic beryllium disease at OSHA and in the much larger private-sector workforce, but it will focus primarily on the lessons I learned as a target of the very agency charged with investigating worker-health and whistleblower retaliation cases. I will pay particular attention to the way scientific claims were evaluated at OSHA and at the U.S. Office of Special Counsel (the agency that is supposed to protect public-sector employees), and try to put my experiences in the context of other, much better-known stories of whistleblowing and their consequences.

John R. Gersh

Visualization in Action (or, how is Mission Control like a thermostat?) (P,G)
Scientific visualization and data mining techniques have enabled the discovery and presentation of insights from complex sets of data. We need to do more than see and understand, though; people use information as the basis for action. We drive cars, control spacecraft, cook dinner, direct autonomous vehicles, fight battles. We use information gleaned from computer-driven displays of the world around us to evaluate what's happening, decide what to do about it, implement an action, and assess the results. This, ideally, involves visualizing and implementing actions, plans, goals, and tasks, all in the context of the states of our systems and of the world. Considerations of cognition, automation roles and supervisory control, human-computer interaction, and knowledge representation all come into play in the design of systems from video recorders to robot spacecraft. For example, spacecraft control is traditionally accomplished through long strings of individual commands to spacecraft components (open the valve, dump the memory, turn on the sensor). Concepts for more autonomous spacecraft, though, involve on-board planning and replanning of higher-level tasks (take some pictures, send them home, respond to component failure). How should the plan be depicted to mission controllers? How about deviations from the original plan? How can system designers and engineers accomplish objectives like these? What theoretical and practical issues are involved in providing visual representations of what we want our automated systems to do and providing indications that they will, in fact, do what we want in evolving situations? What pitfalls lie in store for unwary designers?

"What was I thinking?" - Capturing Analysts' Insights (G,S)
Information analysis and knowledge discovery tasks usually extend over significant periods of time and can involve collaborative teams of people. How can an analyst share insights and discoveries with colleagues and with his or her later self? Intelligence analysis, business planning, and even scientific discovery can all involve querying complex sets of data, visualizing the results, and discovering interesting items or patterns in them. These computer-supported sensemaking activities combine to provide, one hopes, insight into what's going on in the world and why. As Hamming famously said, "the purpose of computing is insight, not numbers." If that is the case, though, the explicit representation of such insight within the information system could potentially benefit the analysis process. Insight in information analysis is a tricky concept, though. Of what does insight consist? What is a general set of attributes by which "an analytical insight" may be characterized? How can the development of insight be recorded as an analyst explores a complex collection of information? How can we handle changes in insight over time? How can insights be further explored, monitored for validity, elaborated, compared, and shared? The talk will cover the development of a simple framework for representing, depicting, and transforming analytical insights, the motivation for this work, its relation to sensemaking theory, examples of visualization and interaction, and application to intelligence analysis problems.

Diane Gifford-Gonzalez

Animal Disease Challenges to the Spread of Pastoralism in Africa: Archaeological and Epizootiological Perspectives (G,S)
African savannas today are home to numerous pastoralist cultures, but the archaeology of sub-Saharan Africa reveals not one but two puzzling delays in the implantation of such cattle-based economies from the Sahara-Sahel region. Domestic cattle are found in the present-day Sahara by at least 8000 years ago. DNA studies suggest they may have been independently domesticated in northern Africa, and in any case, pastoralists thrived there for nearly 5000 years. However, in both East and South Africa, the development of cattle-based economies seems to have been delayed up to a thousand years after the first appearance of sheep and goats. Sleeping sickness is perhaps the least threatening of four diseases lethal to cattle today that probably hindered the successful introduction of cattle-based economies into these regions.

Before Farming and Villages: Early Pastoralists of the Sahara (P,G)
Emerging evidence from the Sahara and from cattle DNA suggests that ancient Africans followed a very different path toward food production than that taken by ancient Near Eastern peoples. Sedentary life around Saharan lakes and rivers during the moist phase immediately after the Ice Age is richly evidenced, as is 9800-year-old pottery. Domestic cattle are found in northern Africa at least 8000 years ago, and DNA studies suggest they may have been independently domesticated in the region. In any case, pastoralists thrived there for nearly 5000 years, without any trace of domestic plants. How did this divergent path emerge? What finally led to domestication of sorghum, millet, yams, and other crops? Why does this seem puzzling?

The Case of the Disappearing Fur Seals: How Bones, Isotopes, and Ancient DNA Are Helping Solve a Prehistoric Mystery (G,S)
Archaeological sites along the southern Californian to Alaskan coasts testify to a different distribution of eared seals than is historically documented. This is especially true of the northern fur seal, which was among the most common pinnipeds in many archaeological sites up to c. AD 1000. Our research team has applied new age determination methods, bone isotope analysis, and ancient DNA to reconstruct prehistoric fur seals' foraging and reproductive habits. We have strong evidence for a substantial resident fur seal population along the coast of the "Lower 48," with multiple rookeries on the Oregon and California coasts, and for substantial differences in the species' weaning behavior. The relevance of our work to ongoing debates about fur seal conservation and about aboriginal peoples' role in their disappearance well before European contact is discussed.

Ancient Farming in Africa: Creating New Species along a Distinctive Path (P)
Ancient African farming and herding has been misunderstood not only by the general public but also by scientists. Only recently has enough archaeological evidence come to light to propose that African peoples brought numerous indigenous plant species under domestication under harsh climatic conditions, after millennia of using pottery, domesticated cattle, sheep, and goats, and intensively harvesting wild plants. This path to farming looks very different from that of people in the Near East, in Mexico, or in South America, but all these independent cases of domestication share an underlying logic.

Manjit S. Kang

Can Crop Scientists Help Feed Ten Billion in 2050? (P)
From the standpoint of food, the world is one civilization and until everyone had food to eat, there would be no peace or prosperity in the world. — David Brinkley.

About 3 billion of the current 6.5 billion people in the world live in poverty, 815 million suffer from hunger, and half of the children in the poorest countries are malnourished. The world's population is projected to reach 9 to 10 billion by 2050. Arable land in the world is limited (currently estimated to be 8.57 billion hectares). During the past 50 years, agricultural research and technology have helped increase the output of world crops two-and-a-half-fold. Nobel Laureate Norman E. Borlaug estimates that world food production would have to triple by 2050 to feed a population of 10 billion. Can this be done?

Genetically Modified Crops: Frankenfoods or Boon for the Poor? (G)
Genetically modified (GM) crops have been developed through years of dedicated research generally dubbed "genetic engineering." Proponents of GM foods argue that the benefits of GM foods, like improved flavor or increased nutritional value, outweigh potential risks. Opponents of GM foods do not believe enough government regulations are in place to control the production and distribution of these foods. They say there has not been enough research or long-term testing on these foods, and no one knows what might happen as a result of growing and eating GM foods or how the environment might be affected.

With more than half of the soybeans and approximately 25% of the corn grown in the U.S. being GM varieties, chances are high that you are eating food containing ingredients derived from GM crops. Are they safe?

In famine-ridden countries, given the choice between starvation and GM food, which would people choose?

What Can GGE Biplot Analysis Do for Plant Scientists? (S)
The newly developed GGE Biplot methodology, based on principal component analysis, is a revolutionary approach to graphical (visual) data analysis. Discussion will include topics such as issues in genotype-by-environment interaction and how GGE biplot methodology helps analyze genotype-by-trait data, QTL data, and diallel and host-pathogen interaction data. The information presented should greatly enhance researchers' ability to understand and interpret their data.
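As a minimal sketch of the underlying idea (not the GGE Biplot software itself), the coordinates plotted in a biplot can be obtained from a singular value decomposition of an environment-centered two-way table. The yield values below are invented purely for illustration:

```python
import numpy as np

# Hypothetical genotype-by-environment yield table
# (rows: genotypes, columns: environments); values are illustrative only.
Y = np.array([
    [4.2, 5.1, 3.8, 4.9],
    [3.9, 4.8, 4.4, 5.2],
    [5.0, 4.1, 3.6, 4.3],
    [4.6, 5.5, 4.0, 5.0],
])

# GGE: subtract the environment main effect (column means), keeping the
# genotype main effect (G) plus genotype-by-environment interaction (GE).
G_plus_GE = Y - Y.mean(axis=0)

# Singular value decomposition; the first two components supply the
# genotype and environment coordinates drawn in a GGE biplot.
U, s, Vt = np.linalg.svd(G_plus_GE, full_matrices=False)
genotype_scores = U[:, :2] * s[:2]   # row (genotype) markers
environment_scores = Vt[:2, :].T     # column (environment) markers

print(genotype_scores.shape, environment_scores.shape)
```

Plotting the two score sets on common axes yields the biplot; the fraction of variance captured by the first two components (`s[:2]**2 / (s**2).sum()`) indicates how faithfully the picture represents the table.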

Todd R. Klaenhammer

Eat Bacteria - Get Cultured: New Horizons in Bioprocessing and Health (G)
From Pasteur to Genomics (P)

Since the time of Louis Pasteur, the field of microbiology has exploded with the realization that billions of microorganisms inhabit our biosphere, our food, and our bodies. Most are non-pathogenic and exist unrecognized in a variety of niches where they often provide important and beneficial roles. The lactic acid bacteria, which have been used traditionally for thousands of years to ferment food, wine, and dairy products, are now being exploited to deliver a variety of health benefits to humans and animals. These bacteria are widely recognized as safe for oral consumption and are now providing unique opportunities to expand our horizons in bioprocessing and health.

Bruce A. Macher

Proteomics and Biomarkers (G)
The development of new mass spectrometric methods, and of software able to link data output from mass spectrometers to protein databases, has provided the potential to identify new biomarkers. However, identification of candidate biomarkers is only the first step in the process. After several years of technology development, it has become apparent that success in identifying and validating biomarkers requires a new approach, one being addressed by funding initiatives developed by researchers and funding agencies, particularly the National Cancer Institute. This lecture is designed to provide an overview of the technology being applied for biomarker discovery and the approaches being developed to ensure that candidate biomarkers are validated and clinical assays developed.

Developing Postdoctoral Scholar Training Programs (P,G)
Most postdoctoral experiences are solely research focused and ignore the need for structured experiences in teaching and career development. Postdoctoral experiences are generally determined and driven by individual research mentors, not institutional policies or programs; thus they vary dramatically in scope, content, and quality. The lack of consistency and the resultant under-preparation of postdoctoral scholars for faculty positions negatively impacts not only the postdoctoral scholars, but also the students they teach and mentor when they become assistant professors. Thus, there is a significant need for a structured program in which postdoctoral scholars could participate just prior to searching for a faculty position, a time at which they are optimally prepared mentally and academically to obtain the greatest benefit. In this lecture, I will review two programs that have been developed in collaboration with colleagues at the University of California campuses at Davis and San Francisco. A description of the program components, lessons learned, and results from the evaluations conducted will be presented. A product of these postdoctoral scholars programs is a set of web modules designed to provide an orientation to college teaching (http://oct.sfsu.edu/).

Gary S. May

Intelligent Semiconductor Manufacturing (S)
Recent innovations in the field of artificial intelligence, including neural networks, genetic algorithms, and expert systems, have the potential to revolutionize the multi-billion dollar semiconductor manufacturing industry. Research at Georgia Tech is a leading contributor to intelligent semiconductor manufacturing.

Diversifying the Engineering Workforce (P)
This talk examines the various factors that contribute to the success of minority students in engineering programs by exploring past and current paradigms promoting success and analyzing models for advancing the participation of members of these populations. Student success is correlated to several indicators, including pre-college preparation, recruitment programs, admissions policies, financial assistance, academic intervention programs, and graduate school preparation and admission. This review suggests that the problem of minority underrepresentation and success in engineering is soluble given the appropriate resources and collective national "will" to propagate effective approaches.

Jon McCammond

Mission Impossible: Learning from what Cannot be Done (P,G)
Many of the most historically famous open problems in mathematics were finally resolved when someone came along and was able to prove, once and for all, that they simply could not be done. This talk consists of a layman's tour through some subset of the following list of such instances: the irrationality of the square root of 2, Euclid's parallel postulate, trisecting an angle, doubling a cube, solving the general fifth-degree polynomial, sizes of infinity, axiomatizing arithmetic, the halting problem, Gödel's incompleteness theorem, and general issues of decidability. The level of this talk can be easily adjusted according to the level of the audience in attendance.
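The first impossibility on this list has a proof short enough to state here, the classical argument that the square root of 2 is irrational:

```latex
Suppose $\sqrt{2} = p/q$ for integers $p, q$ with no common factor.
Squaring gives $p^2 = 2q^2$, so $p^2$ is even, and hence $p$ is even;
write $p = 2k$. Substituting yields $4k^2 = 2q^2$, i.e.\ $q^2 = 2k^2$,
so $q$ is even as well. But then $p$ and $q$ share the factor $2$,
contradicting the assumption. Hence no such fraction exists.
```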

Wallpaper Patterns and Platonic Solids: Understanding the Structure of Symmetries (P,G)
Symmetry is an absolutely fundamental concept that lies at the heart of many key scientific principles. In this talk I will discuss how a mathematician might go about classifying the symmetries possessed by a two- or three-dimensional object such as a molecule or a physical system, leading up to an explanation of the somewhat cryptic claim that there are exactly 17 different types of wallpaper.

Roots, Ratios and Ramanujan: Finding Surprises Through Repetition (G,S)
When things get iterated over and over, something eventually has to give. Like a two-year-old who has discovered the word "why," mathematicians are often fascinated with the results of repetition. This talk will focus on a connected set of surprises that arise through simple iteration: punching the [cos] button repeatedly on a calculator; continued fractions -- along with their connections to the golden ratio and the Fibonacci numbers; and continued square roots, with some mentions of Ramanujan, Chebyshev polynomials, and the Mandelbrot set thrown in along the way.
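The first two iterations mentioned are easy to try for yourself; a minimal sketch in plain Python (nothing specific to the talk):

```python
import math

# Pressing [cos] repeatedly on a calculator converges to the unique
# fixed point of cosine, the solution of x = cos(x) (approx. 0.739085).
x = 1.0
for _ in range(100):
    x = math.cos(x)
print(round(x, 6))  # 0.739085

# The continued fraction 1 + 1/(1 + 1/(1 + ...)) converges to the
# golden ratio; its convergents are ratios of consecutive Fibonacci numbers.
r = 1.0
for _ in range(40):
    r = 1.0 + 1.0 / r
phi = (1 + math.sqrt(5)) / 2
print(abs(r - phi) < 1e-9)  # True
```

Convergence in both cases follows from the iterated map being a contraction near its fixed point, which is exactly the kind of "something has to give" the abstract alludes to.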

Philip A. Meyers

How Did Petroleum Source Rocks Accumulate? Insights from Deep-Sea Sediments (P,G,S)
Petroleum originates from organic-carbon-rich source rocks. Formation of these peculiar rocks requires the unusual combination of high marine biological productivity and survival of a remarkably large fraction of the resulting organic matter while it settles to the sea floor and becomes buried. These essential requirements are rare in today's world, and they have similarly been rare through much of geologic time. However, widespread accumulations of petroleum source rocks have occurred in multi-million-year periods in the past, which indicates that world conditions of the past were sometimes quite different from those of today. This lecture will present and evaluate geochemical evidence of how paleoclimatic and paleoceanographic conditions were different during times of extensive deposition of petroleum source rocks using examples from mid-Cretaceous marine sediments and Mediterranean sapropels, and it will emphasize why petroleum is a limited resource.

Marine Upwelling Systems: Their Oceanography and Paleoceanography (P,G,S)
Important features of the modern ocean are its large coastal upwelling systems. These systems are associated with the eastern boundary currents of the Atlantic and Pacific Oceans and with the monsoons of the Indian Ocean, and they support the major fisheries of these oceans. Wind-driven upwelling brings nutrients from intermediate waters into the photic zone to fuel high rates of algal primary production that convert dissolved inorganic carbon into abundant amounts of organic matter, which in turn lead to accumulation of organic carbon-rich sediments on the underlying seafloor. The burial of marine organic matter is an important component of the global carbon cycle that removes carbon from fast turnover in the biosphere-atmosphere-hydrosphere system and transfers it into the slowly recycling geosphere. However, the high productivity that is typical of today's upwelling systems is a relatively new phenomenon. Organic-carbon-rich upwelling sediments first started to accumulate within the last 15 million years in the Pacific Ocean and less than 10 million years ago in the Atlantic and Indian Oceans. This presentation will give an overview of the oceanographic basics of upwelling systems, discuss the paleoceanographic factors that participated in their different evolutions over geologic time, and consider possible future consequences of over-fishing and climate change.

The Great Lakes of North America and Their Sedimentary Histories of Human Impacts (P,G,S)
Lakes Superior, Huron, Michigan, Saint Clair, Erie, and Ontario constitute the Laurentian Great Lakes. Sedimentary records in the Great Lakes date from retreat of the Laurentide ice sheet from the lake basins about 12,000 years ago. Following retreat of the glaciers, land ecosystems in the Great Lakes region began a progression from tundra to boreal forests to mixed temperate forests dominated by deciduous trees in the south and coniferous ones in the north. After the first permanent European colonies were established around 1700, the pace of environmental change increased, and it especially accelerated after completion of the Erie Canal in 1825. Forests were clear-cut and replaced by farmland, towns and cities grew, heavy industry became established, and some parts of the lakes evolved from oligotrophic to eutrophic as nutrient loadings increased. Evidence of these changes is recorded in components of the lake sediments that were deposited at the various stages of the paleoenvironmental history of the region. This presentation will focus on the sedimentary records of the last two centuries and the evidence of human impacts on the Great Lakes.

Naomi Miller

Past, Present and Future of the Landscape in the Land of King Midas: Gordion, Turkey (P, G)
Gordion was the capital of ancient Phrygia and reputed home of King Midas (c. 700 B.C.). Its monuments include the Midas Mound (Tumulus MM), nearly a hundred smaller burial mounds, and the ancient city of Gordion itself. Studies of the modern forest and steppe vegetation suggest how ancient people may have used different areas within the landscape. Plant remains recovered from the settlement document changes in vegetation and land use from about 1200 BC to 1000 AD. Inspired by our glimpses of formerly healthy steppe vegetation, the Gordion Project is attempting to create a solid cover of grasses and flowers on the Midas Mound to stem erosion.

Has It Always Looked Like This? Long-term Vegetation Changes in the Near East (G)
Vegetation responds to climate and human activity. In west Asia, climate has been implicated in both agricultural origins and a late third millennium collapse of civilization. The problem for archaeologists is determining when a shift occurred and whether it was sufficient to affect (positively or negatively) the established cultural response to normal annual and interannual variability. Examples from west Asia illustrate that when climate is reconstructed from proxy data, human impact on the vegetation may swamp whatever evidence for climate change there is. This talk explains how archaeology can inform our understanding of long-term human impact on the land.

People and Plants: The Present as Key to the Past, Ethnoarchaeology in an Iranian Village (G, S)
In order to connect the traces of the past to a plausible understanding of ancient lifeways, archaeologists use "ethnographic analogy." Some analogies are strong: in highland Iran, the best time to plant wheat is the fall, so ancient wheat was most likely planted that time of year; the span of a roof will depend on the length of available beams. In pre-revolutionary Iran, the village of Malyan was tied into the national and global economy. Nevertheless, the seasonal rhythms of life and many aspects of the built environment are strongly related to the past as uncovered through archaeology. The talk is based on my fieldwork during the 1970s, and explains how I used analogy to interpret ancient plant remains.

Albert J. Paul

Increasing the Accuracy of Data for Compound Semiconductors (S)
The National Institute of Standards and Technology has produced the first high-accuracy composition standard reference material for a ternary III-V compound semiconductor. Photoluminescence spectroscopy (PL) was used to measure the composition-dependent emission energies of Al(x)Ga(1-x)As compound semiconductor films grown by molecular beam epitaxy on GaAs substrates. Results were compared with four independent methods used for measuring composition: reflection high-energy electron diffraction (RHEED), wavelength-dispersive x-ray spectroscopy (WDS), electron microprobe analysis (EMPA), and inductively coupled plasma optical emission spectroscopy (ICP-OES). We also look at other factors that can cause shifts in emission peaks, such as ambient temperature drifts, spatial inhomogeneity, and uncertainty in the wavelength scale.

PL and micro-Raman spectroscopic techniques were also used to measure, respectively, the emission energies and vibrational modes of GaAs substrates and Al(x)Ga(1-x)As thin films as a function of applied stress. A load cell was constructed in order to apply a calibrated biaxial stress to 11 mm specimens that were cut from 75 mm diameter wafers. We examined the peak shifts for five Al mole fractions, which included those with both direct and indirect bandgaps.

Stress data are essential for the design and modeling of semiconductor devices. We demonstrate how the band structure of materials can change from their measured stress-free film properties when the films are attached to surfaces.

Measuring the Stress in Compound Semiconductors (G)
In recent years, strained materials have become increasingly important to the semiconductor industry. They exhibit larger electron mobilities that result in greater power-handling capabilities. Raman and photoluminescence spectroscopies are two non-destructive techniques that can be used to measure the effects of strain on the band structure and properties of thin films. Micro-Raman and photoluminescence spectra of GaAs substrates and Al(x)Ga(1-x)As/GaAs films were obtained as a function of calibrated applied stress. A load cell was constructed to place the specimens under a variable load. Vibrational modes were excited with the 488 nm line of an Ar+ laser. Data were obtained as a function of incident and scattered light polarizations. Photoluminescence measurements were made under the same conditions and on the same specimens used in the Raman measurements. Commercial software was used to fit peak positions of Raman and PL data. The goal of this work is to simultaneously evaluate the stress and composition of Al(x)Ga(1-x)As films.

Vito Quaranta

Integrating Multiscale Data for Simulating Cancer Invasion and Metastasis (G,S)
Cancer research has undergone radical changes in recent times. Producing information both at the basic and clinical levels is no longer the issue. Rather, how to handle this information has become the major obstacle to progress. Intuitive approaches are no longer feasible. The next big step will be to implement mathematical modeling approaches to interrogate the enormous amount of data being produced, and extract useful answers. Quantitative simulation of clinically relevant cancer situations, based on experimentally validated mathematical modeling, provides an opportunity for the researcher, and eventually the clinician, to address data and information in the context of well-formulated questions and "what if" scenarios. At the Vanderbilt Integrative Cancer Biology Center (http://www.vanderbilt.edu/VICBC/) we are implementing a vision for a web site that will serve as a cancer simulational hub. To this end, we are combining the expertise of an interdisciplinary group of scientists, including experimental biologists, clinical oncologists, chemical and biological engineers, computational biologists, computer modelers, theoretical and applied mathematicians, and imaging scientists.

Currently, the major focus of our Center is to produce quantitative computer simulations of cancer invasion at a multiplicity of biological scales. We have several strategies for data collection and modeling approaches at each of several scales, including the cellular (10^0 cells), multicellular (<10^2 cells), and tissue levels (<10^6 to 10^8 cells).

For the cellular scale, simulation of a single cell moving in an extracellular matrix field is being parameterized with data on lamellipodia protrusion, cell speed, and haptotaxis. Some of these data are being collected in novel bioengineered gadgets.

For the multicellular scale, we have adopted the MCF10A three-dimensional mammosphere system. Several parameters, including proliferation, apoptosis, and cell-cell adhesion, are being fed into a mathematical model that simulates mammosphere morphogenesis and realistically takes into account cell mechanical properties.

At the tissue level, our hybrid discrete-continuous mathematical model can predict tumor fingering based on individual cell properties. Therefore, we are parameterizing the hybrid model with data from the cellular and multicellular scales and are validating the model by in vivo imaging of tumor formation.

Understanding Life by Data Integration at Multiple Scales (P,G)
This lecture argues that the best way to understand life is to organize the enormous amount of biological data into a continuum of scales, from molecules, to subcellular organelles, to cells, tissues, organs, and organisms. Examples are given for how this is being done in cancer biology, which is becoming more and more rooted in a comprehensive, systems approach.

Biology Becomes an Exact Science (P,G)
This lecture focuses on the emergence of a new type of biomedical scientist, either experimental biologist or physician, who is fluent in the language of both mathematics and biology. It shows, with concrete examples mostly from cancer research, how biology is undergoing a fast transition akin to the transformation of alchemy into chemistry, and is increasingly adopting the tools and mindset of mathematics and engineering. Considerable creative input will be necessary to finalize this transition.

Sue V. Rosser

The Science Glass Ceiling: Academic Women Scientists and their Struggle to Succeed
The Science Glass Ceiling explores the experiences of women science and engineering faculty in universities across America. Responses of 450 women scientists and engineers to e-mail questionnaires reveal the obstacles and barriers, as well as the encouragement, these women face from their colleagues and institutions. Respondents were recipients of either the NSF POWRE award or the Clare Boothe Luce Professorship awards. In-depth interviews with 50 of these individuals, representing some of the country's top female scientists, about their research, love of science, and daily life in the laboratory suggest solutions to some of the obstacles, along with policy changes that might transform the cultures at both small liberal arts colleges and larger research institutions to enhance women's careers.

Transforming Institutions through ADVANCE
Principal investigators sow the seeds for successful institutionalization and sustainability of their ADVANCE grants when they decide to submit the grant and plan the goals, objectives, and activities underpinning the particular aspects of institutional transformation that their university will pursue within a general framework to advance faculty women to senior and leadership positions. Receiving NSF funding in a very competitive, peer-reviewed program carries considerable prestige, as does the relatively large size of the NSF grants. It is the institutional investment of both human and capital resources, and the commitment of administrators and faculty to establish, change, and implement policies and practices to support ADVANCE, that leverage the NSF support and assure the long-term impact of the initiative.

Veronica Vaida

Solar Energy and the Environment (G)
Sun-light in Atmospheric Chemistry and Climate (S)
Water Aggregates in the Earth's Contemporary and Prebiotic Atmosphere (G)

Cost-competitive carbon-neutral energy is needed to reduce the anthropogenic impact on global climate, provide energy security for the United States, and enable a rise in the world's standard of living. For environmental, geopolitical, and health care reasons, it can be argued that finding and implementing such energy resources is the single most important challenge facing the world in the 21st century. This global challenge has interdisciplinary elements ranging from chemical, physical and environmental sciences, to international law, to studies of the role of existing infrastructure and consumer habits. As the world's population increases in the next 50 years concomitant with economic growth in previously undeveloped countries, new methods must be discovered for converting sunlight, biomass, wind, geothermal energy, and nuclear fuels to useful and transportable energy resources. Specifically, solar photochemistry will be discussed as an example of renewable energy with minimal environmental impacts.

This presentation focuses on the environmental effects of energy consumption. Chemical issues related to the Earth's temperature and climate will be discussed in an attempt to build bridges between research in energy sciences and environmental sciences.

Roger White

Living in Nature: Integrating the Social and Environmental Sciences in Computer Based Models (G,S)
Many of our most pressing problems are the result of the way society interacts with the natural environment. But a functional integration of the social and natural sciences has remained elusive. Part of the problem is that the social sciences, especially sociology and economics, are largely a-spatial (think of the treatment of market equilibrium by economists), while problems in the natural environment are typically contingent on spatial relations (habitat fragmentation is an example). High resolution models of land cover and land use provide a very functional way of linking the two scientific domains. For example, a demographic model predicts an increase in population in a region. Linking that model to a dynamic land use model allows the latter to show which specific parcels of land are likely to be developed and when; thus a prediction of the future course of habitat fragmentation can be generated. In general, the land use model can be seen as a platform through which models from various disciplines can be linked, so that indirectly, each model is forced to respond to the constraints provided by the output of the others. Several cases illustrate the approach. For example, in The Netherlands, where most nature is not natural (it is not uncommon to see a billboard to the effect of "Coming soon on this site: 25 ha of Nature"), integrated models of the human and natural systems are being used to explore the future living environment of the Dutch people under alternative policy options. And in Italy, linked land use and hydrological models are being developed to examine the likely impact of future floods in growing urban regions.

Cities and Regions as Complex Self-Organizing Entities: From Dynamical Theory to Planning and Policy Support Tools with Cellular Automata Based Urban Models (P,G,S)
Cities and regions are highly complex but highly ordered entities in which the structure largely emerges spontaneously as a result of innumerable specific actions taken by individuals and organizations. The tools of complexity theory are allowing rapid advances in our understanding of the process by which this occurs. For example, a number of aspects of cities have been shown to have characteristic fractal dimensions, a signature of self-organization. Dynamic simulation models of urban structure, based on cellular automata, generate high resolution predictions of land use patterns characterized by the same fractal dimensions. These advances are of scientific interest in themselves, but they also have practical applications. The models are now being configured to permit planners to perform what-if experiments: If a new expressway is built on a certain route, what will be the impact on the city over the next ten or twenty-five years? What if a commuter rail line is built instead? What if nothing is done? Versions of these models are now being tested by the Spatial Plan Bureau for The Netherlands and the Institute for Environment and Sustainability of the Joint Research Centre of the European Commission. As smart growth policies are contemplated in a number of North American cities, these are the models that will permit the long term impact of various proposed plans to be visualized and analyzed.

High Resolution Prediction of Growth and Change in Urban Regions: Exploding Grid Cellular Automata with Power-Law Spatial Interaction (G,S)
Cities are highly structured, ever changing agglomerations of people and their economic and social activities. Recent years have seen the development of dynamic simulation models that capture local-scale processes. These models are based on cellular automata and reliably predict changing land use patterns. Other, more traditional models, formulated in terms of spatial interaction equations, are used to predict the changing distribution of people and economic activities among larger spatial units like counties; these macro-scale models are less successful. An exploding grid cellular automaton, in which the cell neighborhood includes the entire modeled area, but with lower resolution at increasing distances, makes it possible to include the long-distance spatial interaction effects within the cellular framework without sacrificing the computational efficiency of the cellular approach. The result is a single model which is much simpler but gives better results than the two types of model it replaces, providing more accurate and reliable predictions of future spatial structure. The approach also yields a deeper insight into the processes that generate structure in urban regions.

Michael Wolf

How Nature Chooses its Shape: The Mathematics of Soap Films (G,S)
From the time of Leibniz's theological description of this world as the best of all possible worlds, mathematicians have struggled to explain why nature is shaped the way we find it. We introduce the mathematical field of the "calculus of variations", using soapy water experiments to illustrate the richness of examples possible in the mathematical subfield of minimal surfaces -- an aesthetically pleasing area explored nightly by small children in their bathtubs.

The Dreary Comfort of the Mathematics Curriculum (P)
Most high school students and undergraduates take mathematics courses. With a small amount of attention from the lecturer, these courses are inoffensive, relatively easy to take, quite easy to teach, and rather uninspiring -- all but the most passionate of incoming students with an interest in math eventually adopt different majors. Yet this situation certainly doesn't derive from the nature of mathematics, as several U.S. departments have enormous enrollments in their mathematics majors. We make several observations from our perspective as professor, department chair, director of a mathematical research program for students, and, most recently, live-in advisor to 300 undergraduates.

Non-Euclidean Geometry and the Shapes of Space (G)
High school geometry is the Euclidean geometry of flat two-dimensional planes. This talk is an introduction to the rich world of the geometry of shapes that are not necessarily two-dimensional, are definitely not flat and are not planes -- yet still retain the appealing aesthetics of abundant symmetry present in Euclidean planes. There has been quite a lot of recent progress in understanding these shapes, and we briefly survey that work.
