Mixing Oil and Ecosystems
“An oil spill is a crime scene,” says Christopher Reddy, but quite unlike the kind in TV whodunits, where fictional forensic whizzes help nail down perpetrators with an arsenal of lab tools. For Reddy, a chemist involved in analyzing oil spills, investigations take years, and do not always yield certain results.
Reddy delivers a colorful account of his research, including an insider’s perspective on the Deepwater Horizon spill. He confesses that not long ago he “was thinking about getting out of the oil spill business”; the incidence of big accidents “had dropped like a rock” since 1991. Then came news of the BP well blowout, and in the spring of 2010 he was invited on the scene to take water samples. Reddy shows video of underwater robots collecting oil from the leaking wellhead, and of the fierce flames from gas burning off nearby. “You couldn’t hear anything, and you could feel the heat on your skin. I’ll never forget it,” he recalls.
Reddy has long experience with tracking oil in the ocean and in the diverse coastal ecosystems where it comes ashore. He has learned that even 30 years after a spill, coastal marshes and shores that appear healthy often conceal toxic sludge that wreaks havoc on flora and fauna. Contrary to oil industry claims, sites don’t rebound easily.
Accounting for the Deepwater Horizon crude (nearly 200 million gallons) and its impact on the ocean and coastal environments has meant taking countless samples and tagging them chemically. Oil is made of thousands of compounds, “each with a different personality, or behavior, like a teenager,” says Reddy, and nature treats these diverse oils in different ways: “Some evaporate, some biodegrade, or break down with sunlight.” Reddy says, “I want to know who’s (in deep water now), who used to be, and why the other guy is on the surface.” This means “punching holes in the water, collecting as many data points as possible.”
The result of this work, involving hundreds of surveys by Reddy and other scientists, has costly legal ramifications for BP and the government, not to mention significant consequences for ecosystems and people living along the Gulf. And the outcome of this research will unfold not over months, but likely over decades, with lingering uncertainties about the ultimate disposition of the oil. “If we can say … about 50% evaporated, about 1/3rd biodegraded and we don’t know where the rest went,” says Reddy, “that might be the best we can get.”
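As a rough illustration of the mass-balance bookkeeping behind that statement, a short Python sketch shows how the budget for the oil’s fate adds up. The figures are only the approximate fractions from Reddy’s quote and the spill estimate above, not measured results.

```python
# Toy oil-budget sketch using the approximate fractions Reddy cites.
# The spill volume and fractions are illustrative, not measured values.

TOTAL_GALLONS = 200_000_000          # "nearly 200 million gallons"

fractions = {
    "evaporated": 0.50,              # "about 50% evaporated"
    "biodegraded": 1 / 3,            # "about 1/3rd biodegraded"
}
# Whatever the named fates don't cover is the unaccounted-for remainder.
fractions["unaccounted"] = 1.0 - sum(fractions.values())

for fate, frac in fractions.items():
    millions = frac * TOTAL_GALLONS / 1e6
    print(f"{fate:>12}: {frac:6.1%}  (~{millions:.0f} million gallons)")
```

The point of the sketch is simply that once evaporation and biodegradation are pinned down, the leftover fraction is what remains unexplained, which is why Reddy calls such a partial accounting “the best we can get.”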
Categories: TemeTV
The Financial Crisis, the Recession, and the American Political Economy: A Systemic Perspective
Charles Ferguson shows how useful a varied background in math, political science and business can be, as he dissects the complexities and recent crisis of the U.S. financial system. In a lecture that distills many of the arguments of his recent film, Inside Job, Ferguson conveys dispassionately yet persuasively the reasons we all should feel profound anxiety not only about the nation’s financial institutions, but about our economic and political future as well.
Ferguson details the “securitization food chain,” a system of investing (and gambling) with debt that U.S. financial institutions enthusiastically adopted around 15 years ago. Encouraged by friendly government policies, a handful of investment behemoths such as JP Morgan and Lehman Brothers began transforming the banking landscape, buying up mortgages and other forms of debt worth countless billions of dollars, and packaging these securities for buyers worldwide. Allied financial institutions became adept at selling cheap mortgages to ordinary people, creating an inflated housing market. Insurance and ratings companies bought in. The speed of growth and scale of this securities chain was unprecedented, recounts Ferguson -- as was its impact on the nation’s economy, both at the market’s peak, and after its collapse.
Ferguson provides a detailed and pointed sidebar on the industry incentives that underlay the wild growth years. These included allowing investment banks to bet on the failure of their own securities, and linking rating agencies’ income to their approval of risky securities. Individuals inside big institutions made out like bandits, because they could. Senior executives in places like Bear Stearns took out over $1 billion in cash each in the years prior to the 2008 collapse. The head of Countrywide Mortgage saw the end coming, and cashed out over $100 million in stock. Asks Ferguson, “Why was such extreme behavior permitted? I have to conclude there was a complete abdication on the part of the regulatory system.”
Ferguson finds government apathy galling, both in regulating and in prosecuting high-end white-collar crime, but he perceives the reason: a financial services industry that “as it rapidly consolidated and concentrated became the dominant source not only of corporate profits but campaign contributions and political funding in the U.S.” Evidence of unrestrained financial power lies in the fact that the government response to the crisis has been engineered by Wall Street insiders intent on shoring up firms too big to fail. Ferguson also cites “corruption of the economics discipline,” the rising role of money in politics, and the increasing concentration of wealth in the hands of a few.
The dominance of a single industry constitutes a deep change and danger for America, believes Ferguson. The nation “has evolved a political duopoly where two political parties agree on things related to finance and money.” Without a political structure immune to such influence, Ferguson sees little likelihood of challenging the interests of the financial giants.
Categories: TemeTV
Report Card on President Obama: MIT Experts Assess President Obama on Afghanistan, Climate, and the Economy
President Obama scored abysmally on his midterms. A trio of MIT professors renders harsh judgment on the president halfway through his administration, and their assessments may leave listeners “weeping or depressed,” in the words of moderator Richard Samuels.
National security expert Barry Posen reviews the administration’s strategy and implementation of the war in Afghanistan. This conflict was adopted by the president and many Democrats as “the right war” following the wrong-headed invasion of Iraq, says Posen. But even after the commitment of tens of thousands more troops and nearly $100 billion a year to Afghanistan, uncertainty remains about how to complete the mission: to clear out the Taliban, secure critical regions, and build up a successful Afghan police force and government. While the Pentagon seems to support an “open-ended project aimed at defeating the Taliban,” the president appears intent on limiting the venture, with the aim of drawing down troops beginning in July 2011.
But Posen is skeptical of the overall project: Afghan politics are corrupt and rife with ethnic rivalries, and the Afghan administration is incompetent, so the idea of setting up a government “to compete with the Taliban probably won’t work well.” Though there are frequent reports of Taliban leaders being killed, “many doubt the Taliban can be killed off as fast as they regenerate,” and there is little chance of serious negotiation with them. The creation of a functioning Afghanistan “looks like a costly, lengthy gamble,” but the strategy is driven by politics, says Posen: “Democrats are quite concerned not to appear authors of defeat.”
The U.S. missed a vital opportunity to take the lead in addressing climate change, says Henry “Jake” Jacoby. Early on, the Obama administration “hurt prospects for progress,” putting healthcare reform first when it had a choice between “the health of the people and the planet.” And the administration didn’t forcefully back either the House or Senate versions of climate legislation, which attempted to produce an “economically rational” approach to pricing greenhouse gas emissions. Then came the recession, which doomed any chance for moving climate legislation forward, since it “made imposing costs very difficult,” says Jacoby.
What troubles him more is that the Obama administration has essentially “given the pulpit over to people against any action, and deniers.” Republicans seem to be winning the war of public opinion, claiming that measures against climate change will strangle the economy, and are now pressing to relieve the EPA of its power to regulate CO2. The “outlook is dark,” says Jacoby. “The word carbon is not said in polite company, and won’t be said in Washington.”
While it is a “terrific achievement” that we avoided another Great Depression, Simon Johnson is still “giving out failing grades” to this administration. Although Obama and his economic advisers basically got it right with the stimulus, they shockingly departed from best practices on banking policy, he believes. When major banks flounder, you close some of them down, fire managers, and eliminate boards of directors, but “whatever you do, you cannot provide these banks with an unconditional bailout,” he says. Rewarding banks for bad behavior leaves us in “a very awkward and unpleasant position.” Because the administration let banks remain too big to fail and sidestepped tough financial reform, he says, recovered banks will fight all the harder against any effort to rein them in. “By building implicit subsidy schemes into the structures in which banks survive,” we are stuck with “a few banks with excessive power,” and the “administration is responsible for setting us up for serious trouble down the road.”
Categories: TemeTV
Peace Meals
While breaking bread around the world with friends and families suffering through war and deprivation, Anna Badkhen managed to compile not just a vivid chronicle of lives under duress, but a cookbook. In this dialogue with MIT political scientist Fotini Christia, Badkhen describes her new work, Peace Meals: Candy-Wrapped Kalashnikovs and Other War Stories, in which by some Proustian process frontline reporting melds with tasty recipes.
Conflicts in Iraq, Afghanistan, the Middle East or Africa seem remote to most Americans, seen mainly through the lens of a news camera. In contrast, Badkhen takes “a quiet, intimate long look” into the living rooms of people under constant threat of violence and destitution. Badkhen’s persistence and patience over 10 years of reporting have won her friends in dangerous and ravaged lands. Peace Meals arose from a series of extended conversations about survival with her most memorable acquaintances -- over a good meal: “All that is holding us together are stories. And (my subjects) tell stories from their dinner tables.”
In the book, Badkhen describes the mingled experience of intimate talk and food preparation, as well as the complex stew of culture, history and politics that is a necessary part of each survivor’s story. No matter how extreme her subjects’ circumstances, “the more stripped down the house or kitchen, the more the emptiness was filled with extraordinary humanity and generosity.”
For her, each recipe or meal evokes a unique encounter and acquaintance. Dolma (stuffed grape leaves) calls up her Iraqi reporter friend and his family, who cooked with her in 2003 “while U.S. planes were bombing their hometown.” A hearty borscht summons the evening in 2002 when Russian authorities stormed a Moscow theater held by Chechen terrorists, leading to the death of 129 people. For Russians, this beet soup is “the ultimate comfort food, like donuts,” says Badkhen. Her friends “went for the borscht” because it was “hot, and protects you from the physical cold of living in a country that doesn’t care.” An American Army commander in Iraq shared his chow hall meal: a burger, corn dog, French fries and Jell-O. “He ate the same meal every day,” Badkhen says, regardless of whatever else was on the menu. “He felt each meal might be his last … If the day ends, and he is still alive, there will be the corn dog which will remind him of home.”
Categories: TemeTV
Negotiating the Gulf Disaster
The Gulf oil spill hurt many individuals and businesses, and there is broad agreement that they deserve compensation. But working out the nuances of damage payments is no simple matter, as Lawrence Susskind describes in conversation with an MIT Museum audience.
The $20 billion Oil Spill Compensation Fund, created by BP at the behest of the president, seemed a beneficial alternative to the endless litigation following the Exxon Valdez disaster. But even with the fund’s experienced “paymaster,” Ken Feinberg (who managed 9/11 claims), this approach to compensation is a “big experiment,” says Susskind. “We’re not used to doing it this way, at this scale.”
Susskind outlines and then raises several questions about the claims process. Feinberg has solicited documentation from victims that will demonstrate loss of income during the period of the spill. He has promised to write checks as fast as possible. But Susskind wonders if Feinberg, with his staff of 25, can sort through and reasonably assess the mountains of material claimants send in. Does the money go to those most in need, who have lost mortgages on their fishing boats or homes, or to condo developers in Florida who cannot sell their properties? Susskind wants to understand Feinberg’s “philosophical stand in regard to fairness.”
“Timing matters” as well. Susskind wonders about the tradeoffs between getting money to people quickly, and the need to examine claims with the kind of care required to avoid exploitation of the system. Should Feinberg “lower the standard of proof” to help people in urgent need? Finally, Susskind worries about the wisdom of separating compensation payments “from the issue of fact-finding with regard to fault, long-term environmental restoration, and punitive payments aimed at changing behaviors to avoid future accidents.” BP’s $20 billion payout does not really punish the corporation, which earns this much money in three months, says Susskind. So if the compensation plan does not help correct BP’s “risky behavior,” and send a message to the entire industry, what is the best means to avoid another Deepwater Horizon?
Susskind proposes one remedy to prevent continued slipshod practices in the offshore oil and gas industry. He cites the creation of the non-profit Institute of Nuclear Power Operations after the Three Mile Island accident, which the nuclear power industry developed “to police itself, so bad actors became the industry’s responsibility, not just the government’s.” Sloppy or unsafe nuclear plants can lose their insurance and are “out of business,” says Susskind. Why not, he suggests, “take this analogy and apply it to offshore oil and gas”?
Categories: TemeTV
Humanities in the Digital Age
Reports of the demise of the humanities are exaggerated, suggest these panelists, but there may be reason to fear their loss of relevance. Three scholars whose work spans a variety of disciplines, and who bring wide knowledge of the worlds of academia and publishing, ponder the meaning and mission of the humanities in the digital age.
Getting a handle on the term itself proves somewhat elusive. Alison Byerly invokes those fields involved with “pondering the deep questions of humanity,” such as languages, the arts, literature, philosophy and religion. Steven Pinker boils it down to “the study of the products of the human mind.” Moderator David Thorburn wonders if the humanities are those endeavors that rely on interpretive rather than empirical research, but both panelists vigorously make the case that the liberal arts offer increasing opportunities for data-based analysis.
Technology is opening up new avenues for humanities scholars. In general, Byerly notes, humanities “tend to privilege individual texts or products of the human mind, rather than collective wisdom or data.” More recently, online collections and databases of text, art and music make possible wholly different frameworks for study. Pinker cites his own use of automated text analysis in Google Books to research the history of violence, tracing the rise and fall of such words as “glorious” and “honorable” -- connected in times past with nations’ war-making. Humanities scholars could routinely deploy tools like this to strengthen argument and interpretation, says Pinker, allowing them “to say things are warranted, true, coherent.”
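As a minimal sketch of the kind of word-frequency tracking Pinker describes -- not his actual method and not the Google Books data, just a tiny corpus of dated texts invented here for illustration -- a few lines of Python show how the rise and fall of target words can be traced over time:

```python
# Toy word-frequency tracker: count how often target words appear per year,
# normalized by the total words written that year. Purely illustrative;
# the corpus below is made up and stands in for a real dated text collection.
from collections import Counter, defaultdict
import re

corpus = [
    (1820, "a glorious and honorable victory for the nation"),
    (1920, "the honorable members debated the treaty at length"),
    (2000, "the committee reviewed the data and published the results"),
]
targets = {"glorious", "honorable"}

totals = Counter()                 # total word count per year
counts = defaultdict(Counter)      # target-word counts per year
for year, text in corpus:
    words = re.findall(r"[a-z]+", text.lower())
    totals[year] += len(words)
    counts[year].update(w for w in words if w in targets)

for year in sorted(counts):
    for word in sorted(targets):
        rate = counts[year][word] / totals[year]
        print(f"{year}  {word:>10}: {rate:.3f} per word")
```

The per-year normalization is the essential step: it is what allows frequencies from corpora of very different sizes and decades to be compared, which is how a rise or fall in words like “glorious” becomes visible at all.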
Humanists are adopting new tools and methods for teaching and publishing as well. Byerly describes the freedom afforded her as a professor of Victorian literature when she can direct students to specific interactive websites for historical and cultural background, allowing her to focus on a specific novel in class. As Middlebury provost, she has been broadening the concept of publication to include work in different media online. However, as Pinker notes, the process for publishing articles in scholarly journals remains painfully slow: in experimental psychology, a “six year lag from having an idea to seeing it in print.” He suggests a “second look” at the process of peer review, perhaps publishing everything online, “and stuff that’s crummy sinks to the bottom as no one links to or reads it.” Pinker looks forward to a future where he no longer has to spend a “lot of time leafing through thick books” looking for text strings, or flipping to and from footnotes. “We could love books as much as we always have, but not necessarily confine ourselves to their limitations, which are just historical artifacts,” he says.
Such changes in the humanities may not come a moment too soon. In spite of relatively stable numbers of graduates in the U.S., the liberal arts may be increasingly endangered. Byerly sees “an inherent aura of remoteness about humanities: It studies the past, and distant past. At a time when technology seems to be speeding things up, bringing information to us faster, humanities’ pace doesn’t seem in tune with the times.” Pinker’s “nightmare scenario” is the “disaggregation of practical aspects of undergraduate education of students, and humanities,” akin to the way newspapers lost classified advertising. Humanities faculty, who tend not to bring in grants the way science faculty do, may prove an irresistible target for budget cutters. To protect these fields, Pinker proposes, integrate them with social sciences: connect English literature to the sciences of human nature, for instance, or music theory to auditory perception. “Make humanities’ faculty indispensable,” he urges.
Categories: TemeTV
Negotiating the Gulf Disaster
The Gulf Oil spill hurt many individuals and businesses, and there is broad agreement that they deserve compensation. But working out the nuances of damage payment is no simple matter, as Lawrence Susskind describes in conversation with an MIT Museum audience.
The $20 billion Oil Spill Compensation Fund, created by BP at the behest of the president, seemed a beneficial alternative to the endless litigation following the Exxon Valdez disaster. But even with the fund’s experienced “paymaster,” Ken Feinberg (who managed 9/11 claims), this approach to compensation is a “big experiment,” says Susskind. “We’re not used to doing it this way, at this scale.”
Susskind outlines and then raises several questions about the claims process. Feinberg has solicited documentation from victims that will demonstrate loss of income during the period of the spill. He has promised to write checks as fast as possible. But Susskind wonders if Feinberg, with his staff of 25, can sort through and reasonably assess the mountains of material claimants send in. Does the money go to those most in need, who can no longer pay the mortgages on their fishing boats or homes, or to condo developers in Florida who cannot sell their properties? Susskind wants to understand Feinberg’s “philosophical stand in regard to fairness.”
“Timing matters” as well. Susskind wonders about the tradeoffs between getting money to people quickly, and the need to examine claims with the kind of care required to avoid exploitation of the system. Should Feinberg “lower the standard of proof” to help people in urgent need? Finally, Susskind worries about the wisdom of separating compensation payments “from the issue of fact-finding with regard to fault, long-term environmental restoration, and punitive payments aimed at changing behaviors to avoid future accidents.” BP’s $20 billion payout does not really punish the corporation, which earns this much money in three months, says Susskind. So if the compensation plan does not help correct BP’s “risky behavior,” and send a message to the entire industry, what is the best means to avoid another Deepwater Horizon?
Susskind proposes one remedy to prevent continued slipshod practices in the offshore oil and gas industry. He cites the creation of the non-profit Institute for Nuclear Power Operations after the Three Mile Island accident, which the nuclear power industry developed “to police itself, so bad actors became the industry’s responsibility, not just the government’s.” Sloppy or unsafe nuclear plants can lose their insurance, and are “out of business,” says Susskind. Why not “take this analogy and apply it to offshore oil and gas,” he suggests.
Categories: TemeTV
Toward Efficient Airport Operations
Few of us would elect to spend countless hours at the airport watching planes arrive, depart and sit at gates. But what constitutes a punishment for some actually energizes Hamsa Balakrishnan, whose research focuses on improving airport operations. Her goal is to make air travel more efficient, robust and green.
Flying is quite frequently a trial these days, Balakrishnan acknowledges, with delays growing yearly, even during the recession when actual flights decreased. Congestion at the nation’s busiest airports is primarily responsible for these delays, which produce billion-dollar losses for the airlines, environmental damage as idling planes burn millions of gallons of fuel, and untold aggravation for passengers. If gridlock at these airports is not addressed, these problems will only worsen, says Balakrishnan, with the nation’s roughly 35,000 daily flights projected to double by 2025.
With a team of students, Balakrishnan has been analyzing airport operations. Air traffic controllers must routinely separate plane landings by a few minutes, and balance the need for safety with maximum efficiency. With departures, controllers attempt to respond to pilots on a first-come, first-served basis, but must pause for arrivals if runways are busy, and must juggle take-off order if planes are due at other airports. The current model for scheduling, called constrained position shifting, says Balakrishnan, has “been conjectured to have exponential computational complexity,” and, most important, does not seem the optimal method for controllers dealing with busy, real-time conditions.
Balakrishnan has recently broken through conventional scheduling complexities. Her approach involves developing simple, practical algorithms that improve takeoff and landing efficiency while factoring in typical aircraft arrival and departure protocols, weather, and other variables. She is now testing her own scheduling models at Boston’s Logan Airport at rush hour. So far, her team has achieved improvements in “runway throughput” equivalent to two to three extra flights per hour -- a 10-12% reduction in average flight delay.
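The talk does not spell out her algorithms, but the constrained-position-shifting idea itself is straightforward to illustrate: each aircraft may move at most a few places from its first-come, first-served position, and the scheduler searches those limited reorderings for one that respects minimum separation while minimizing total delay. The brute-force sketch below uses invented times and a toy separation rule; her actual models are far more sophisticated and efficient.

```python
# Toy illustration of constrained position shifting (CPS): aircraft may move at
# most MAX_SHIFT places from their first-come, first-served (FCFS) order, and we
# pick the reordering that minimizes total delay subject to a minimum separation.
# Brute force over permutations is fine for a handful of aircraft, not real traffic.
from itertools import permutations

MAX_SHIFT = 2          # maximum position shift from FCFS order
SEPARATION = 90        # required seconds between consecutive runway uses

# (name, earliest time the aircraft is ready to use the runway), in FCFS order
fcfs = [("A", 0), ("B", 10), ("C", 20), ("D", 25), ("E", 400)]

def schedule_delay(order):
    """Total delay if aircraft use the runway in the given order."""
    total, last = 0, None
    for name, ready in order:
        t = ready if last is None else max(ready, last + SEPARATION)
        total += t - ready
        last = t
    return total

best = None
for perm in permutations(fcfs):
    # enforce the CPS constraint: no aircraft drifts more than MAX_SHIFT positions
    if all(abs(perm.index(ac) - i) <= MAX_SHIFT for i, ac in enumerate(fcfs)):
        d = schedule_delay(perm)
        if best is None or d < best[0]:
            best = (d, perm)

print("best order:", [name for name, _ in best[1]], "total delay:", best[0], "seconds")
```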
She is also working on reducing the amount of time planes spend waiting in departure queues burning fuel, a phenomenon resulting from saturation in ground traffic. In tests with Boston controllers, her team used color-coded cards to signal when planes should actually push back from the gate and fire up their engines. By manipulating pushback rates, says Balakrishnan, you can significantly decrease the amount of fuel burned, reducing CO2 and particulate release. Controllers also felt things “flowed better,” she says. Next steps include a comprehensive evaluation of benefits, with an eye to developing “scalable control and optimization algorithms” for an increasingly busy aviation system.
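The intuition behind pushback rate control is that an aircraft holding at the gate with engines off burns far less fuel than one idling in the departure queue, and that holding it costs no runway throughput when the queue is already saturated. The toy simulation below, with invented parameters rather than figures from the talk, illustrates that tradeoff:

```python
# Toy simulation of pushback rate control: hold aircraft at the gate (engines
# off) whenever the departure queue is already saturated, and compare
# engine-on waiting time against letting everyone push back immediately.
# All numbers here are invented for illustration.
import random

random.seed(1)
SERVICE = 90                 # seconds between successive departures
N_FLIGHTS = 40
requests = sorted(random.uniform(0, 1800) for _ in range(N_FLIGHTS))  # pushback requests

def engine_on_wait(queue_cap):
    """Total seconds spent with engines running while waiting for the runway.

    queue_cap is the maximum number of aircraft allowed off the gate at once;
    None means no control, so every aircraft pushes back as soon as it asks."""
    runway_free = 0.0
    departures = []          # departure times of earlier aircraft
    total = 0.0
    for req in requests:
        if queue_cap is not None:
            # aircraft still waiting ahead of this one at the moment it asks
            ahead = [d for d in departures if d > req]
            # push back now if the queue is short, otherwise wait until it drains
            pushback = req if len(ahead) < queue_cap else sorted(ahead)[-queue_cap]
        else:
            pushback = req
        depart = max(pushback, runway_free)
        runway_free = depart + SERVICE
        departures.append(depart)
        total += depart - pushback
    return total

print("no control:", round(engine_on_wait(None) / 60), "engine-on minutes spent waiting")
print("cap of 5  :", round(engine_on_wait(5) / 60), "engine-on minutes spent waiting")
```

Because the runway serves the same sequence either way, departure times are unchanged; only the time spent burning fuel in the queue shrinks, which is the point of the color-coded gate-hold cards.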
Categories: TemeTV
Rebuilding Haiti
Difficult as it is to look beyond the acute misery of Haiti’s current crisis, Paul Farmer proposes that aid agencies and others concerned with rebuilding focus on the nation’s “old, chronic problems.” There’s no shortage of recovery ideas, he says, but these will go nowhere if they do not also advance the long-neglected, basic rights of Haitians.
Farmer describes efforts to respond to Haiti’s disastrous earthquake of January 2010, which killed hundreds of thousands, left 1.3 million homeless and much of the capital in ruins. Today, nearly a year later, the generous pledges of international aid have yet to materialize, says Farmer, and the peril has expanded to include a cholera outbreak. This picture is all the bleaker for the deaths of many of Farmer’s collaborators. The earthquake destroyed invaluable “human infrastructure,” says Farmer, including all the nursing students at Haiti’s one public nursing school.
Farmer has been working in Haiti for more than a decade, attempting to address not just malnutrition, HIV and tuberculosis, but larger issues such as Haitians’ lack of access to clean water, public education and healthcare. He would like to see international aid groups and foreign powers involved with Haiti recognize these issues in a meaningful way. Farmer’s long-standing strategy has been to engage Haiti’s public sector, or what remains after years of military and U.S. proxy rule, in the fight for these rights. He says, “There is always a role for the promotion of basic rights…The question is how to do this in the field, not just win an argument in seminar.”
The earthquake has profoundly deepened Haiti’s need for essential public institutions. The 1,000-plus tent cities housing more than a million people in Port-au-Prince are swelling, not diminishing, because people cannot find potable water anywhere else, and most have no idea where their next meal will come from. Yet there is a push to expel people from their tents and tarps, says Farmer, as if that will somehow speed construction of more permanent residences. Many plans are afoot for such housing, he says -- but few that take into account the desires of Haitians, who should have agency in shaping their own future. Rebuilding Haiti, Farmer believes, means “rebuilding aid machinery which is very broken, and often a damaging thing.” He is forging new alliances among Haitians and other aid partners, including Cubans and evangelical groups from the U.S., around water projects, and a new hospital that will be “big, green and public.” Says Farmer, “We must make common cause with those seeking to provide basic rights.”
Categories: TemeTV
A New Language for Mental Illness
Mental illness needs a “new narrative,” says Jane Pauley. Just as cancer has moved from the shadows to pink ribbons and races for the cure, mental illness must shed its public aura of fear and shame. “Shrewd move; let’s do that,” says Pauley.
In a revealing and self-effacing talk, Pauley describes her own passage a decade ago from poster girl for NBC News to psychiatric patient. At 50, she was well aware of her reputation: “I could make no credible claim to being the best, hardest working, most beautiful in the industry. But honest, I owned normal. Or I thought I did.” So the “bombshell diagnosis” of bipolar disorder, brought on by steroid treatment for hives, and antidepressants, rocked her world.
It was a long struggle to crawl back from “the dark precipice of mental illness,” which included a period of hospitalization. And it did not help that her doctor was shocked that Pauley, who was writing an autobiography, wanted to discuss her condition in the book. In spite of such anguish and anxiety, Pauley says she “had hope” even from the beginning. Medicine helped, but Pauley also credits the capacity to open up about her situation with family and increasingly, in public forums. “When I’m heard talking comfortably about mental illness, as comfortably as talking about triple bypass surgery, I think I’m helping normalize mental illness. Normalizing is a much better word than destigmatizing. Change vocabulary, narrative; change minds, save lives,” she says.
Today Pauley sees a shift in how people regard mental illness, a new candor. Knowledge is the antidote to fear, she believes, and work “demystifying the brain is a step toward destigmatizing mental illness.” Her personal goal, she concludes, is to “banish ugly, out-of-date attitudes” and replace them with “new neural connections, positive associations. As they say, consciousness once raised cannot easily be lowered again.”
Categories: TemeTV
Open Payment, A New Approach to Public Transportation Fare Collection
Soon, a ticket to ride won’t require paper coupons, tokens, human vendors, or even Boston’s CharlieCard. Urban transit is abandoning a century-old payment system for sophisticated digital payment technology, says George Kocur.
Kocur has been toiling for a decade on technology and methods that will enable transit industry operations to become more intelligent and efficient. He notes that many cities have already acquired an assortment of improvements -- smartphone apps and GPS networks to keep transport on time, for instance. But these are often expensive proprietary services and products offered by a hodgepodge of vendors. Kocur makes the argument for developing non-proprietary systems, especially around fare payment, which could be utilized by multiple transit authorities, reducing costs over time.
He describes the evolution of a “generic e-collection technology framework,” based on a standardized “contactless” payment card used in many stores. This card, bearing valid credit after an online or phone transaction, can serve riders as a monthly pass, or even a single-trip ticket. It’s also very fast. In New York City tests, the e-collection card completed a transaction with a server via fiber-optic network in 200-300 milliseconds on subway rides, and 400-800 milliseconds on that city’s buses (wireless data moves a tad slower). In contrast, Boston’s CharlieCard has a built-in chip that calculates the cost of the trip and debits it from the card, consuming valuable seconds.
Kocur is also working on a “fare engine” that maps “a set of card taps into a set of journey segments,” and groups these segments into trips, and trips into fares. Complex algorithms come into play, and the end result would give riders real-time options for both journey routing and fares. This software is flexible enough to work in London, New York and other cities, optimizing for each system’s travel network. To accommodate riders without bank credit, researchers are coming up with options including ATMs that accept cash to credit a fare card.
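As described, the fare engine is a pipeline: raw card taps become journey segments, segments are grouped into trips, and trips are priced. A stripped-down sketch of such a pipeline, with invented fare rules and a hypothetical transfer window rather than anything from Kocur’s system, might look like this:

```python
# Stripped-down sketch of the tap -> segment -> trip -> fare pipeline described
# above. The flat fare and the 30-minute transfer window are invented.
from dataclasses import dataclass

BASE_FARE = 2.40           # hypothetical flat fare per trip
TRANSFER_WINDOW = 30 * 60  # re-entry within 30 minutes of an exit counts as one trip

@dataclass
class Tap:
    card_id: str
    time: int        # seconds since midnight
    station: str
    kind: str        # "entry" or "exit"

def taps_to_segments(taps):
    """Pair each entry tap with the next exit tap on the same card."""
    segments, open_entry = [], {}
    for tap in sorted(taps, key=lambda t: (t.card_id, t.time)):
        if tap.kind == "entry":
            open_entry[tap.card_id] = tap
        elif tap.card_id in open_entry:
            segments.append((open_entry.pop(tap.card_id), tap))
    return segments

def segments_to_trips(segments):
    """Group a card's segments into trips using the transfer window."""
    trips = []
    for entry, exit_ in sorted(segments, key=lambda s: (s[0].card_id, s[0].time)):
        last = trips[-1] if trips else None
        if (last and last[-1][0].card_id == entry.card_id
                and entry.time - last[-1][1].time <= TRANSFER_WINDOW):
            last.append((entry, exit_))      # free transfer: extend the current trip
        else:
            trips.append([(entry, exit_)])   # new trip
    return trips

def total_fare(trips):
    return BASE_FARE * len(trips)            # flat fare per trip in this sketch

taps = [
    Tap("card1", 8 * 3600,        "Kendall",       "entry"),
    Tap("card1", 8 * 3600 + 900,  "South Station", "exit"),
    Tap("card1", 8 * 3600 + 1500, "South Station", "entry"),  # within transfer window
    Tap("card1", 8 * 3600 + 2400, "Airport",       "exit"),
]
trips = segments_to_trips(taps_to_segments(taps))
print(len(trips), "trip(s), total fare: $%.2f" % total_fare(trips))
```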
Kocur ultimately envisions piecing together “components that could be shared across transit systems,” perhaps even a single card accepted at transit agencies around the world. He hopes to demonstrate that “we no longer need something specific to each agency that’s expensive and difficult.” This would mean public transportation leaders talking to each other, as well as to banks and credit card companies. “It’s just about change in the transit industry, using technology as a lever,” he concludes.
Categories: TemeTV