Humanities in the Digital Age
Reports of the demise of the humanities are exaggerated, these panelists suggest, but there may be reason to fear their loss of relevance. Three scholars whose work spans a variety of disciplines, and who bring wide knowledge of the worlds of academia and publishing, ponder the meaning and mission of the humanities in the digital age.
Getting a handle on the term itself proves somewhat elusive. Alison Byerly invokes those fields involved with “pondering the deep questions of humanity,” such as languages, the arts, literature, philosophy and religion. Steven Pinker boils it down to “the study of the products of the human mind.” Moderator David Thorburn wonders if the humanities are those endeavors that rely on interpretive rather than empirical research, but both panelists vigorously make the case that the liberal arts offer increasing opportunities for data-based analysis.
Technology is opening up new avenues for humanities scholars. In general, Byerly notes, the humanities “tend to privilege individual texts or products of the human mind, rather than collective wisdom or data.” More recently, online collections and databases of text, art and music make possible wholly different frameworks for study. Pinker cites his own use of automated text analysis in Google Books to research the history of violence, tracing the rise and fall of such words as “glorious” and “honorable” -- connected in times past with nations’ war-making. Humanities scholars could routinely deploy tools like this to strengthen argument and interpretation, says Pinker, allowing them “to say things are warranted, true, coherent.”
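To make the method concrete, here is a minimal sketch of word-frequency tracking over time, assuming a tab-separated file in the style of the public Google Books Ngram exports (ngram, year, match_count, volume_count); the file name and target words are illustrative only, not Pinker's actual pipeline.

```python
# Sketch of the kind of word-frequency-over-time analysis described above.
# Assumes a TSV file shaped like the public Google Books Ngram exports:
#   ngram <TAB> year <TAB> match_count <TAB> volume_count
# "ngrams_sample.tsv" is a hypothetical local extract, not a real dataset path.
from collections import defaultdict
import csv

TARGET_WORDS = {"glorious", "honorable"}   # words Pinker traces in the talk

def frequency_by_decade(path):
    """Return {word: {decade: total match_count}} for the target words."""
    counts = defaultdict(lambda: defaultdict(int))
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f, delimiter="\t"):
            word, year, matches = row[0].lower(), int(row[1]), int(row[2])
            if word in TARGET_WORDS:
                counts[word][(year // 10) * 10] += matches
    return counts

if __name__ == "__main__":
    for word, by_decade in sorted(frequency_by_decade("ngrams_sample.tsv").items()):
        for decade in sorted(by_decade):
            print(f"{word}\t{decade}s\t{by_decade[decade]}")
```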
Humanists are adopting new tools and methods for teaching and publishing as well. Byerly describes the freedom afforded her as a professor of Victorian literature when she can direct students to specific interactive websites for historical and cultural background, allowing her to focus on a specific novel in class. As Middlebury provost, she has been broadening the concept of publication to include work in different media online. However, as Pinker notes, the process for publishing articles in scholarly journals remains painfully slow: in experimental psychology, a “six year lag from having an idea to seeing it in print.” He suggests a “second look” at the process of peer review, perhaps publishing everything online, “and stuff that’s crummy sinks to the bottom as no one links to or reads it.” Pinker looks forward to a future where he no longer has to spend a “lot of time leafing through thick books” looking for text strings, or flipping to and from footnotes. “We could love books as much as we always have, but not necessarily confine ourselves to their limitations, which are just historical artifacts,” he says.
Such changes in the humanities may not come a moment too soon. In spite of relatively stable numbers of graduates in the U.S., the liberal arts may be increasingly endangered. Byerly sees “an inherent aura of remoteness about humanities: It studies the past, and distant past. At a time when technology seems to be speeding things up, bringing information to us faster, humanities’ pace doesn’t seem in tune with the times.” Pinker’s “nightmare scenario” is the “disaggregation of practical aspects of undergraduate education of students, and humanities,” akin to the way newspapers lost classified advertising. Humanities faculty, who tend not to bring in grants the way science faculty do, may prove an irresistible target for budget cutters. To protect these fields, Pinker proposes, integrate them with social sciences: connect English literature to the sciences of human nature, for instance, or music theory to auditory perception. “Make humanities’ faculty indispensable,” he urges.
Categories: TemeTV
Negotiating the Gulf Disaster
The Gulf oil spill hurt many individuals and businesses, and there is broad agreement that they deserve compensation. But working out the nuances of damage payments is no simple matter, as Lawrence Susskind describes in conversation with an MIT Museum audience.
The $20 billion Oil Spill Compensation Fund, created by BP at the behest of the president, seemed a beneficial alternative to the endless litigation following the Exxon Valdez disaster. But even with the fund’s experienced “paymaster,” Ken Feinberg (who managed 9/11 claims), this approach to compensation is a “big experiment,” says Susskind. “We’re not used to doing it this way, at this scale.”
Susskind outlines and then raises several questions about the claims process. Feinberg has solicited documentation from victims that will demonstrate loss of income during the period of the spill. He has promised to write checks as fast as possible. But Susskind wonders if Feinberg, with his staff of 25, can sort through and reasonably assess the mountains of material claimants send in. Does the money go to those most in need, who have lost mortgages on their fishing boats or homes, or to condo developers in Florida who cannot sell their properties? Susskind wants to understand Feinberg’s “philosophical stand in regard to fairness.”
“Timing matters” as well. Susskind wonders about the tradeoffs between getting money to people quickly, and the need to examine claims with the kind of care required to avoid exploitation of the system. Should Feinberg “lower the standard of proof” to help people in urgent need? Finally, Susskind worries about the wisdom of separating compensation payments “from the issue of fact-finding with regard to fault, long-term environmental restoration, and punitive payments aimed at changing behaviors to avoid future accidents.” BP’s $20 billion payout does not really punish the corporation, which earns this much money in three months, says Susskind. So if the compensation plan does not help correct BP’s “risky behavior,” and send a message to the entire industry, what is the best means to avoid another Deepwater Horizon?
Susskind proposes one remedy to prevent continued slipshod practices in the offshore oil and gas industry. He cites the creation of the non-profit Institute for Nuclear Power Operations after the Three Mile Island accident, which the nuclear power industry developed “to police itself, so bad actors became the industry’s responsibility, not just the government’s.” Sloppy or unsafe nuclear plants can lose their insurance, and are “out of business,” says Susskind. Why not “take this analogy and apply it to offshore oil and gas,” he suggests.
Categories: TemeTV
Toward Efficient Airport Operations
Few of us would elect to spend countless hours at the airport watching planes arrive, depart and sit at gates. But what constitutes a punishment for some actually energizes Hamsa Balakrishnan, whose research focuses on improving airport operations. Her goal is to make air travel more efficient, robust and green.
Flying is quite frequently a trial these days, Balakrishnan acknowledges, with delays growing yearly, even during the recession when the actual number of flights decreased. Congestion at the nation’s busiest airports is primarily responsible for these delays, which produce billion-dollar losses for the airlines, environmental damage as idling planes burn millions of gallons of fuel, and untold aggravation for passengers. If gridlock at these airports is not addressed, these problems will only worsen, says Balakrishnan, as the nation’s approximately 35,000 daily flights are projected to double by 2025.
With a team of students, Balakrishnan has been analyzing airport operations. Air traffic controllers must routinely separate plane landings by a few minutes, and balance the need for safety with maximum efficiency. With departures, controllers attempt to respond to pilots on a first come, first served basis, but must pause for arrivals if runways are busy, and must juggle take-off order if planes are due at other airports. The current model for scheduling, called constrained position shifting, says Balakrishnan, has “been conjectured to have exponential computational complexity,” and most important, does not seem the optimal method for controllers dealing with busy, real-time conditions.
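As a rough illustration of what constrained position shifting means: each aircraft may move at most a fixed number of positions from its first-come, first-served slot, and a scheduler searches the remaining orderings for the one with the least total delay. The brute-force toy below uses invented separation times and flight data and is not the method discussed in the talk.

```python
# Toy illustration of constrained position shifting (CPS). Brute force is only
# feasible for a handful of aircraft; all numbers here are invented.
from itertools import permutations

MAX_SHIFT = 2        # each aircraft may move at most 2 positions from FCFS order
SEPARATION = 90      # invented uniform separation between runway operations, seconds

# (flight, earliest feasible runway time in seconds), listed in FCFS order
aircraft = [("A", 0), ("B", 30), ("C", 45), ("D", 200), ("E", 210)]
fcfs_position = {name: i for i, (name, _) in enumerate(aircraft)}

def total_delay(order):
    """Total delay if the runway serves flights in this order with uniform separation."""
    t, delay = -SEPARATION, 0          # -SEPARATION so the first flight is unconstrained
    for _name, earliest in order:
        t = max(t + SEPARATION, earliest)
        delay += t - earliest
    return delay

def within_shift_limit(order):
    """True if no flight has moved more than MAX_SHIFT positions from FCFS."""
    return all(abs(i - fcfs_position[name]) <= MAX_SHIFT
               for i, (name, _) in enumerate(order))

best = min((p for p in permutations(aircraft) if within_shift_limit(p)), key=total_delay)
print("best order:", [name for name, _ in best], "total delay:", total_delay(best))
```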
Balakrishnan has recently broken through conventional scheduling complexities. Her approach involves developing simple, practical algorithms that improve takeoff and landing efficiency while factoring in typical aircraft arrival and departure protocols, weather, and other variables. She is now testing her own scheduling models at Boston’s Logan Airport, at rush hour. So far, her team has achieved improvements in “runway throughput” equivalent to two to three extra flights per hour -- a 10 to 12 percent improvement in average flight delay.
She is also working on reducing the amount of time planes spend waiting in departure queues burning fuel, a phenomenon resulting from saturation in ground traffic. In tests with Boston controllers, her team used color-coded cards to signal when planes should actually push back from the gate and fire up their engines. By manipulating pushback rates, says Balakrishnan, you can significantly decrease the amount of fuel burned, reducing CO2 and particulate release. Controllers also felt things “flowed better,” she says. Next steps include a comprehensive evaluation of benefits, with an eye to developing “scalable control and optimization algorithms” for an increasingly busy aviation system.
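One way to picture such a pushback policy is a simple threshold rule: hold departures at the gate whenever too many aircraft are already taxiing out. The sketch below uses invented numbers and is only a caricature of the card-based procedure tested with Boston controllers.

```python
# Heavily simplified threshold-style pushback policy: keep the number of
# aircraft taxiing out at or below a cap, so the rest wait at the gate with
# engines off instead of idling in the departure queue. Numbers are invented.
TAXI_CAP = 10   # hypothetical ceiling on aircraft taxiing out at once

def pushback_decision(num_taxiing_out, gate_queue):
    """Return the flights cleared to push back during this control interval."""
    # Clear only enough flights to keep the airport surface at or below the cap.
    slots = max(0, TAXI_CAP - num_taxiing_out)
    return gate_queue[:slots]

# With 8 aircraft already taxiing, only two of the four waiting flights push back.
print(pushback_decision(8, ["DL101", "AA42", "UA7", "B688"]))   # -> ['DL101', 'AA42']
```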
Categories: TemeTV
Rebuilding Haiti
Difficult as it is to look beyond the acute misery of Haiti’s current crisis, Paul Farmer proposes that aid agencies and others concerned with rebuilding focus on the nation’s “old, chronic problems.” There’s no shortage of recovery ideas, he says, but these will go nowhere if they do not also advance the long-neglected, basic rights of Haitians.
Farmer describes efforts to respond to Haiti’s disastrous earthquake of January 2010, which killed hundreds of thousands, left 1.3 million homeless and much of the capital in ruins. Today, nearly a year later, the generous pledges of international aid have yet to materialize, says Farmer, and the peril has expanded to include a cholera outbreak. This picture is all the bleaker for the deaths of many of Farmer’s collaborators. The earthquake destroyed invaluable “human infrastructure,” says Farmer, including all the nursing students at Haiti’s one public nursing school.
Farmer has been working in Haiti for more than a decade, attempting to address not just malnutrition, HIV and tuberculosis, but larger issues such as Haitians’ lack of access to clean water, public education and healthcare. He would like to see international aid groups and foreign powers involved with Haiti recognize these issues in a meaningful way. Farmer’s long-standing strategy has been to engage Haiti’s public sector, or what remains after years of military and U.S. proxy rule, in the fight for these rights. He says, “There is always a role for the promotion of basic rights…The question is how to do this in the field, not just win an argument in seminar.”
The earthquake has profoundly deepened Haiti’s need for essential public institutions. The 1,000-plus tent cities housing more than a million people in Port-au-Prince are swelling, not diminishing, because people cannot find potable water anywhere else, and most have no idea where their next meal will come from. Yet there is a push to expel people from their tents and tarps, says Farmer, as if that will somehow speed construction of more permanent residences. Many plans are afoot for such housing, he says -- but few that take into account the desires of Haitians, who should have agency in shaping their own future. Rebuilding Haiti, Farmer believes, means “rebuilding aid machinery which is very broken, and often a damaging thing.” He is forging new alliances among Haitians and other aid partners, including Cubans and evangelical groups from the U.S., around water projects, and a new hospital that will be “big, green and public.” Says Farmer, “We must make common cause with those seeking to provide basic rights.”
Categories: TemeTV
A New Language for Mental Illness
Mental illness needs a “new narrative,” says Jane Pauley. Just as cancer has moved from the shadows to pink ribbons and races for the cure, mental illness must shed its public aura of fear and shame. “Shrewd move; let’s do that,” says Pauley.
In a revealing and self-effacing talk, Pauley describes her own passage a decade ago from poster girl for NBC News to psychiatric patient. At 50, she was well aware of her reputation: “I could make no credible claim to being the best, hardest working, most beautiful in the industry. But honest, I owned normal. Or I thought I did.” So the “bombshell diagnosis” of bipolar disorder, brought on by steroid treatment for hives, and antidepressants, rocked her world.
It was a long struggle to crawl back from “the dark precipice of mental illness,” which included a period of hospitalization. And it did not help that her doctor was shocked that Pauley, who was writing an autobiography, wanted to discuss her condition in the book. In spite of such anguish and anxiety, Pauley says she “had hope” even from the beginning. Medicine helped, but Pauley also credits the capacity to open up about her situation with family and increasingly, in public forums. “When I’m heard talking comfortably about mental illness, as comfortably as talking about triple bypass surgery, I think I’m helping normalize mental illness. Normalizing is a much better word than destigmatizing. Change vocabulary, narrative; change minds, save lives,” she says.
Today Pauley sees a shift in how people regard mental illness, a new candor. Knowledge is the antidote to fear, she believes, and work “demystifying the brain is a step toward destigmatizing mental illness.” Her personal goal, she concludes, is to “banish ugly, out-of-date attitudes” and replace them with “new neural connections, positive associations. As they say, consciousness once raised cannot easily be lowered again.”
Categories: TemeTV
Open Payment, A New Approach to Public Transportation Fare Collection
Soon, a ticket to ride won’t require paper coupons, tokens, human vendors, or even Boston’s CharlieCard. Urban transit is abandoning a century-old payment system for sophisticated digital payment technology, says George Kocur.
Kocur has been toiling for a decade on technology and methods that will enable transit industry operations to become more intelligent and efficient. He notes that many cities have already acquired an assortment of improvements -- smartphone apps, and GPS networks to keep transport on time, for instance. But these are often expensive proprietary services and products offered by a hodgepodge of vendors. Kocur makes the argument for developing non-proprietary systems, especially around fare payment, which could be utilized by multiple transit authorities, reducing costs over time.
He describes the evolution of a “generic e-collection technology framework,” based on a standardized ‘contactless’ payment card used in many stores. This card, bearing valid credit after an online or phone transaction, can serve riders as a monthly pass, or even a single-trip ticket. It’s also very fast. In New York City tests, the e-collection card managed a transaction with a server via fiber-optic network in 200-300 milliseconds on subway rides, and 400-800 milliseconds on that city’s buses (wireless data moves a tad slower). In contrast, Boston’s CharlieCard has a built-in chip that calculates the cost of the trip and debits it from the card, consuming valuable seconds.
Kocur is also working on a “fare engine” that maps “a set of card taps into a set of journey segments,” and groups these segments into trips, and trips into fares. Complex algorithms come into play, and the end result would permit riders real-time options on both journey-routing and fares. This software is flexible enough to work in London, New York and other cities, optimizing for each system’s travel network. To accommodate riders without bank credit, researchers are coming up with options including ATMs that accept cash to credit a fare card.
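The taps-to-trips-to-fares idea can be sketched in a few lines, assuming an invented flat fare with free transfers inside a 30-minute window; the real engine Kocur describes handles far richer tariffs and network-specific routing.

```python
# Minimal sketch of mapping card taps -> journey segments -> trips -> fares.
# The fare rule (flat fare per trip, free transfers within 30 minutes) and the
# station names are invented for illustration only.
from dataclasses import dataclass

TRANSFER_WINDOW_S = 30 * 60   # taps within 30 minutes count as one trip (invented rule)
FLAT_FARE = 2.40              # hypothetical flat fare per trip

@dataclass
class Tap:
    card_id: str
    station: str
    timestamp: int            # seconds since midnight

def group_into_trips(taps):
    """Group one card's taps (journey segments) into trips by transfer window."""
    trips, current = [], []
    for tap in sorted(taps, key=lambda t: t.timestamp):
        if current and tap.timestamp - current[-1].timestamp > TRANSFER_WINDOW_S:
            trips.append(current)
            current = []
        current.append(tap)
    if current:
        trips.append(current)
    return trips

def fare_for(taps):
    """Price a card's day of travel: one flat fare per trip."""
    return FLAT_FARE * len(group_into_trips(taps))

taps = [Tap("card1", "Park St", 8 * 3600),
        Tap("card1", "Downtown Crossing", 8 * 3600 + 900),  # transfer, same trip
        Tap("card1", "Airport", 18 * 3600)]                 # evening, new trip
print(fare_for(taps))   # 4.8 -> two trips at the flat fare
```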
Kocur ultimately envisions piecing together “components that could be shared across transit systems,” perhaps even a single card accepted at transit agencies around the world. He hopes to demonstrate that “we no longer need something specific to each agency that’s expensive and difficult.” This would mean public transportation leaders talking to each other, as well as to banks and credit card companies. “It’s just about change in the transit industry, using technology as a lever,” he concludes.
Categories: TemeTV
The Art of Science Communication
You wouldn’t know that Alan Alda felt nervous in advance of addressing this audience of neuroscientists. In his trademark style, Alda chats up the crowd like an old friend, sharing anecdotes involving one of his great pursuits: “I love to talk to scientists,” he says.
When he is not on stage or in a film, Alda works to advance the public understanding of science. For more than a decade, he has served as a kind of super talent for Scientific American Frontiers on PBS, helping develop a unique kind of program. Meeting scientists around the world, Alda would pose a series of unscripted questions, the more naïve the better. “An amazing thing happened on their end: the real ‘them’ came out. They weren’t lecturing me, but connecting with me and trying to get me to understand. These conversation modes brought out not only their own personalities, but the science through their personalities.”
Whether climbing a forbidden stairway in the Leaning Tower of Pisa, or squatting at the rim of a crater on the suspiciously steaming Vesuvius volcano, Alda always managed to engage his scientist confederate in lively and instructive interactions. In this “wonderful system,” says Alda, the “scientist would warm up to me and the science would come out in a way that was understandable.” He relates a revelatory incident, where a scientist inadvertently turned away from him during taping and addressed the camera instead. Her tone became instantly dry and the information “unintelligible.” This episode “changed the course of my life,” says Alda, leading him to pursue his own research on how spontaneous social communication can simply vanish in certain circumstances. If scientists could readily summon the capacity for everyday, natural communication, Alda suggests, imagine how much more effective they might be.
He shows “before and after” videos of young engineers with whom he has worked on improvisation exercises. Post-Alda, they appear to express themselves with greater warmth. “Understanding and reading faces and speaking in a tone of voice that carries emotion and meaning above and beyond words” is critical, says Alda. He hopes that researchers at places like the McGovern Institute can help unravel the neurological basis for the kind of communication “that makes us human,” work that someday may help “scientists all over…to speak in their own voices.”
Categories: TemeTV
The Energy/Climate-Change Challenge and the Role of Nuclear Energy in Meeting It
In a meaty lecture that serves as a concise and comprehensive primer on the twin challenge of energy and environment, John Holdren lays out the difficult options for contending with a world rapidly overheating.
“There is no question the world is growing hotter,” says Holdren, “and we do have a pretty good handle on … influences on climate that are changing the average temperature of the Earth,” he says. Since the mid-19th century, there has been a 20-fold increase in the world’s use of energy, the preponderance of which comes from burning fossil fuels. The U.S. is 82% dependent on these fuels, and the rest of the world is racing to catch up. If all nations continue business as usual, says Holdren, by 2030 energy use will increase by about 60% over 2005 levels, with fossil fuels comprising about 70% of world energy use. While there is legitimate concern about the economic, political and security risks of fossil fuel dependence, he says, CO2 and other greenhouse gas emissions that result from fossil fuel combustion pose an immense, immediate threat to the planet. From urban and regional air pollution to massive wildfires and fierce storms that bring coastal inundation, dramatic climate disruption is upon us and demands action now.
In order to avoid the biggest risks, such as a temperature increase of several degrees centigrade, we must “sharply change the ratio of energy used essentially immediately,” Holdren says. But it would cost around $15 trillion to convert the world’s fossil-fuel-dependent energy system into something less destructive, and this conversion would take too long, even if nations could agree on an alternative system. So we are confronted with striking a balance between mitigation and adaptation. Scientists think stabilizing atmospheric CO2 concentrations at 450 parts per million by 2030 might give humanity a shot at avoiding a planet with temperatures as high as those 30 million years ago (when crocodiles swam off Greenland and palm trees swayed in Wyoming).
Looking to cut CO2 emissions drastically, the Obama Administration is intent on achieving changes in vehicle fuel efficiency, promoting public transportation and other measures. But realistically, adaptation must also come into play, including changes in agricultural practices, engineering defenses against rising coastal waters, and warding off tropical diseases. The longer we wait, says Holdren, the more expensive mitigation and adaptation become.
The wrenching changes needed across the board to reach the ambitious goal of 450 ppm require “barrier-busting incentives,” and cannot be accomplished without eliminating “perverse incentives” that encourage business as usual. Holdren believes carbon pricing is essential and inevitable, despite the current climate in Washington. Nuclear power has a critical role to play in this transformation -- including the elusive goal of fusion reactors -- but it must be part of a larger surge in R&D spending on new energy technology ($15 billion versus the current $4 billion per year). The political will to meet this challenge remains a sticking point, and so scientists must do a better job explaining climate change to people, says Holdren. Since there is no silver bullet for the problem, he concludes, “we have got to do it all. If you look at the magnitude of the challenge and the amount by which we must reduce the ratio of greenhouse gas emissions to useful energy supplied to the economy, we can leave no stone unturned, and that’s what we’re trying to get done.”
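To make the scale of the challenge concrete, here is a back-of-the-envelope calculation in Python using only the rounded figures cited above; it is purely illustrative, not an independent projection.

```python
# Figures from the talk: business-as-usual 2030 energy use roughly 60%
# above 2005 levels, with fossil fuels supplying about 70% of the total.
energy_2005 = 1.0                    # normalize 2005 world energy use to 1
energy_2030 = energy_2005 * 1.60     # ~60% growth under business as usual
fossil_share_2030 = 0.70             # ~70% of 2030 energy still fossil
fossil_energy_2030 = energy_2030 * fossil_share_2030

print(f"2030 total energy use (x 2005): {energy_2030:.2f}")
print(f"2030 fossil energy alone (x 2005 total): {fossil_energy_2030:.2f}")
# Fossil use alone in 2030 (~1.12x) would exceed total 2005 energy use,
# which is the scale behind the call to cut the emissions-to-energy ratio.
```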
Categories: TemeTV
Autism Research: Progress and Promises
“Imagine what it’s like to go through life without understanding what people you are with are thinking,” poses Gerald Fischbach. “You have no way of gauging whether they are angry, sad or happy.” At the core of the group of disorders known as autism, says Fischbach, is damaged social cognition, a kind of prison of the mind. First defined in 1943, autism has not readily yielded its secrets to scientists, but in the past decade, says Fischbach, there has been “remarkable progress” in working out the disorder’s likely causes and mechanisms.
As many as one in 100 people are now said to live with autism, up from one in 1000 a few years ago, but Fischbach believes the increasing numbers are more likely due to broadening public awareness and continually expanding definitions of the disorder, rather than an “epidemic.” Research on this pervasive problem proceeds on several fronts: genetic risk factors, molecular mechanisms, and neural circuits, cognition and behavior. Fischbach notes a plethora of genetic approaches to autism but says, “We researchers feel we are on to something” focusing on a type of genetic change called a copy number variant.
Ordinarily, individuals inherit one copy of each gene from each parent, but sometimes this process goes awry, leading to variation in the number of copies of particular genes. Studies show that copy number deletions in a certain region of DNA correspond to a “big risk factor” for autism. But these clues are just the start, says Fischbach. Now researchers must begin “figuring out precisely which gene is at fault, and what it is doing in the nervous system.”
Fischbach’s Simons Foundation is assembling a research pool of families with autistic members to serve as a long-term resource for scientists investigating not just copy number variants, but also other disorders with autistic features, including Rett syndrome and Fragile X syndrome. McGovern Institute research is revealing the central role of the synapse in these disorders, and imaging work is helping to point out regions of the brain central to the performance of social tasks and possibly to autistic behaviors.
Fischbach hopes in the next decade science will figure out not just gene factors, but the neural circuitry at play in autism. Says Fischbach, “In the end, we need to develop theories and models to account for the link between genes and behavior … It’s not enough to say autism is a disorder of synapses, or of connections. Of course it is. We need more specific hypotheses about autism and how it relates to social behavior.”
Categories: TemeTV
Online Migration of Newspapers
Two seasoned media observers map out shifting terrain in the news industry, as digital forces shake up print journalism. They also predict some likely survivors and casualties of this upheaval.
David Carr now sees a porous border, if not a great deal of overlap, between once-segregated domains of traditional and online journalism. “Whether you’re looking at it on iPad, or enabled TV, or paper, there won’t be old media or new media, there will just be media.” As a New York Times media columnist, Carr both analyzes and participates in emerging hybrid news platforms. He has a quarter million followers on Twitter (“If my last name weren’t ‘NYT,’ it would be about 250”), and says one “can get hooked on that.”
The Times has seized on new technology to forge a path back to profitability. Says Carr, “We have a skunk works upstairs where propeller heads and mad scientists do who knows what…We get help from the tech heads to make things work better, and create more audience participation.” These innovations make reading The Times online a nearly endless experience --“You feel like you’re down a hobbit hole,” says Carr. Special online features and editions may be subject to “convenience charges” in the future, yielding new sources of revenue to help replace lost advertising dollars.
While he celebrates the proliferation of news websites, Carr has few kind words for online news aggregators such as Huffington Post, which he views as commoditizing content stolen from newspapers like The Times, leading to further decimation of old guard publications. He also frets about the transfer of audience loyalty from newspapers to blogs and Twitter. “The dispersal of authority is a threat over the long term for newspapers,” especially if pay walls go up around newspaper content. He worries about information becoming ghettoized. “I don’t want great journalism to be a high-class district where everyone isn’t invited,” says Carr.
“The Boston Globe, Philadelphia Inquirer, papers in that weight class are the most threatened,” says Dan Kennedy. “They’re struggling to make themselves essential” doing just regional stories, since they have neither the staff nor the rationale for covering international or even national news. Kennedy worries that these papers may not be able to make a case for themselves with readers, given such a narrow mission. But there is some consolation: in some cities where such papers have already disappeared, or are in retreat, new forms of journalism are emerging. Kennedy has been studying New Haven, where the old city newspaper has fled to the suburbs chasing ad revenue, and a nonprofit community website, the New Haven Independent, has risen to cover the inner city. Funded by foundations and contributions, this tiny newsroom of four full-time reporters on bikes “covers anything that moves in the neighborhoods of New Haven,” using a blog format with picture stories and video, and getting the word out with Twitter and Facebook. “People kill themselves doing reporting,” says Kennedy.
Nonprofit and for-profit models of community journalism, such as Patch.com and Wicked Local, are popping up everywhere in vacuums left by shrinking newspapers. But the financial viability of these small enterprises is uncertain, and Kennedy acknowledges advertising money will never again primarily power news operations in any medium. “If we are going to preserve journalism, professional journalism -- and it’s not 100% clear there’s a huge desire to do that -- we must move toward a model in which the user pays for much larger share of content,” Kennedy says.
Categories: TemeTV
McGovern Institute: Ten Years of Understanding the Brain in Health and Disease
Psychiatric illness and neurological disorders such as autism, depression, and Alzheimer’s disease cause countless families to suffer, and require prodigious economic resources to manage. Now, thanks to major advances in genomics, systems neuroscience, and human brain imaging, says Robert Desimone, scientists are unlocking key secrets in how the human brain functions, work that may herald new and more effective therapies for neural disorders.
In his keynote address, Desimone pays tribute to McGovern Institute researchers who are tackling a common problem: understanding the neural circuit.
Ed Boyden works with different wavelengths of light to turn targeted cells on and off in living brains, “much the way a conductor controls musicians in an orchestra,” says Desimone. Boyden has focused in particular on the “straightforward circuit” of the retina, replacing dead photoreceptors with genetically manipulated, light-sensitive molecules so that mice with impaired vision see light again. Someday, this research could help people with similar kinds of blindness.
McGovern researchers are also untangling the more complex neural circuitry associated with psychiatric diseases and developmental disorders. Michale Fee’s model of the neural basis for bird song identified a brain structure that has an exact parallel in mammals -- a loop connecting the cortex and basal ganglia in which motor sequences move through a chain of neurons in precise order, “like dominoes falling.” A mistake in this circuit in humans could result in behavioral disorders.
Guoping Feng demonstrates that a single malfunctioning synaptic protein can wreak havoc on the basal ganglia, disrupting learning in humans. He has also determined that related circuits bearing gene mutations create behavior in mice that remarkably mirrors obsessive-compulsive disorder in humans.
Yingxi Lin has identified a gene that helps the brain regulate the excitatory and inhibitory synapses, keeping neurons in balance, the way a thermostat regulates temperature in a room. Without this gene, mice “get too much excitation” and develop seizure disorders. She has discovered a comparable gene in autistic people, who also are prone to seizures. Other McGovern researchers are developing next generation diagnostic tools.
John Gabrieli has mapped out the circuits central to high level cognitive functions, and will soon be deploying a new kind of imaging that gives a precise picture of dynamic changes in brain states, measured in milliseconds. And Alan Jasanoff uses genetic engineering techniques to create new molecules that act as sensors, showing the release and flow of chemicals in the brain that can highlight both healthy and diseased circuitry. Insights from McGovern research, says Desimone, “will lay down the foundation for therapeutics of the future.”
Categories: TemeTV