Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Topics - Agrul

151
General Discussion / Free-flowing water on Mars?
« on: December 23, 2013, 01:15:06 PM »
Before it lost its atmosphere, Mars was covered with liquid water. There is now evidence that certain areas of the red planet may still have free-flowing water during certain times of the year. The results come from Alfred McEwen of the University of Arizona and were published in Nature Geoscience.

In 2011, NASA’s Mars Reconnaissance Orbiter (MRO) provided images that showed dark streaks in the soil in areas near the equator. Though they fade over time, the streaks return at the warmest part of each year. The most likely explanation is that liquid water flows on the surface of Mars under certain circumstances, an unprecedented finding, since the atmosphere is much too thin to retain liquid water for long periods of time.

The trouble is, researchers don’t know enough about Martian geology and composition to definitively say where the water could be coming from. There could be pockets of ice underneath the surface that liquefy when warmed, but the emergence pattern of the dark streaks doesn’t seem to suggest that. There is also a possibility that the streaks are caused by water vapor being pulled from the atmosphere and condensed into the soil.

For life as we know it to exist, liquid water is a must. The fact that there is still liquid water on the surface of Mars is very exciting, but space agencies need to proceed carefully. Any probes that visit these potentially watery areas must be completely sterilized, which is a complicated and expensive procedure. If there were any traces of Earth microbes on the probe, they could easily contaminate the very thing the probe was sent to study. The Committee on Space Research (COSPAR), part of an international organization that defines good research practices in space, would shut down any mission that did not ensure the utmost cleanliness of the spacecraft.

Of course, completely eliminating microbes is incredibly expensive. To fully prevent contamination, the probe would need to be treated with heat, hydrogen peroxide vapor, or ionizing radiation to kill anything that might be stuck to the spacecraft. Similar treatments were performed on Voyager 1 & 2 and accounted for about 10% of a mission budget that has spanned over 35 years. For a probe that would land on Mars and analyze potential liquid water samples, the sterilization price tag could be too great for any one agency to bear. Eventually, though, the mission will be done correctly, and having accurate and meaningful data will be well worth the investment.

http://www.iflscience.com/space/free-flowing-water-discovered-equator-mars

http://link.brightcove.com/services/player/bcpid1399191810?bctid=2917976474001

152
Agrulian Archives / Number Theory (undergrad student level)
« on: December 23, 2013, 12:28:33 PM »
Brief Subject Overview: number theory is the study of the integers, and as such is really accessible, at least at first, without the need for a lot of higher mathematics. It's a bit like graph theory in that regard, although also like graph theory it becomes surprisingly entwined with e.g. continuous mathematics pretty quickly. Number theory's one of the oldest branches of math, and apparently Gauss wanted to fellate number theory pretty badly, but I have almost no appreciation for the subject, never having formally studied it, read very much about it, taken any coursework on it, etc. This thread's meant to fill that gap.
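
(A quick taste of how self-contained the subject is at the start; this is just the classic Euclid argument, nothing specific to the text below.) Suppose $p_1, \dots, p_k$ were a complete list of the primes and set
\[
N = p_1 p_2 \cdots p_k + 1 .
\]
No $p_i$ divides $N$ (each leaves remainder 1), so any prime factor of $N$ lies outside the list; hence the primes cannot be finite in number.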

Text(s): Hardy & Wright's "An Intro to the Theory of Numbers" (also see here for a PDF version).

Assigned problems:

TBA

Replies will contain worked solutions, discussion, etc.

153
Spamalot / must-see movies of the last decade
« on: December 23, 2013, 01:05:19 AM »
also merely entertaining & fun entries acceptable

let's hear em

The Other Guys vis-a-vis taket got me thinking about it. enjoying the shit out of this movie but had never heard of it, assuming there's a ton of other shit i haven't heard of too

154
Physicists at the University of Chicago and the University of Massachusetts, Amherst, are uncovering the fundamental physical laws that govern the behavior of cellular materials.

"We don't have any tools or formalism to think about these types of materials, and that's what we've been trying to go after," said Margaret Gardel, professor in physics at UChicago.

Gardel and Jennifer Ross of the University of Massachusetts, Amherst, are supported in this work by a four-year, $800,000 INSPIRE grant from the National Science Foundation.

Gardel studies the building blocks of the cytoskeleton -- the materials inside a cell that provide its shape and allow it to move -- by extracting proteins from the cell and studying how they interact in vitro. "These materials are what makes living cells living materials and not dead materials," said Gardel.

Biological materials behave differently than non-living materials because, unlike conventional materials, they are not in a state of equilibrium -- they constantly consume energy and do work with that energy. Studying the unique physics of such materials is interesting in its own right and could allow physicists to produce novel materials for applications outside the lab. "We are trying to take advantage of what is intrinsically new that these materials can do, that cannot be done by equilibrium material," said physics graduate student Patrick McCall, a member of Gardel's lab.

The NSF created the INSPIRE grants to fund interdisciplinary research that is innovative, and perhaps risky, but which could lead to big leaps in understanding. Gardel and Ross's research fits the bill because the scientists are working in uncharted territory -- without theories to guide their way -- as these materials are still poorly understood. "These systems are a part of nature that physics is not great at describing," Ross said.

Through their work, Gardel and Ross plan to catalog the phases of biological materials. Just as traditional materials can be in a solid, liquid, or gas phase, biological materials may have multiple phases where the materials behave in drastically different ways. Rather than temperature and pressure causing phase changes, the important variables to study might be the concentration or types of proteins in the mix.

One example of a phase transition in cellular material occurs with contractility, a quality that Gardel has studied extensively in her lab. Give biological materials the right conditions and they can contract, but change those conditions and suddenly contraction is no longer possible. Cells must contract to move, adhere to surfaces, and generate forces. Although contraction in skeletal and cardiac muscle cells has been well-understood for 60 years, finding a model of contraction for other types of cells has proven tricky. "That was sort of what I got stuck on when I started my lab," said Gardel.

Contraction in skeletal and cardiac muscle cells is generated by filaments made up of the protein actin in combination with the motor protein myosin. The myosin molecules walk along the filaments of actin, pulling the actin closer together and generating contraction along the filament. This model requires a precisely aligned network of actin and myosin, so that each tug causes the filament to get shorter, not longer. But this model does not explain contraction in other types of cells, which have disordered networks of actin. Research in Gardel's lab showed that the actin behaves like a rope, buckling out of the way when compressed, but resisting tension when pulled. This allows the whole network to compress, despite its disorder.

Gardel's research on contraction earned her a $450,000 Early Excellence Award from the American Asthma Foundation last year, which will allow her to study whether contraction in airway muscle cells can also be described by her model.
While Gardel's research has focused on actin, which is only one component of the cytoskeleton, Ross studies microtubules, which she describes as the "bones" of the cell, whereas actin is the muscle. "Much like the bones and muscles of your body work together to allow you to move, we believe the microtubules and actin cytoskeletons work in concert to enable cell shape changes and motility," said Ross.

Gardel's research brings the strategies of condensed matter physics (the physics of liquids and solids) to biological materials, with the goal of understanding how cells behave from the bottom up. The INSPIRE grant will allow Ross and Gardel to rigorously test cellular models and explore the underlying physics. "There are lots of theories out there about how these cytoskeletal structures are regulated, but very little quantitative physical data," Gardel said.

http://www.sciencedaily.com/releases/2013/12/131218143647.htm?utm_source=feedburner&utm_medium=email&utm_campaign=Feed%3A+sciencedaily%2Fmatter_energy%2Fphysics+%28ScienceDaily%3A+Matter+%26+Energy+News+--+Physics%29

155
General Discussion / Canada : Prostitutes, eh?
« on: December 20, 2013, 06:28:39 PM »
Canada's highest court struck down the country's anti-prostitution laws in their entirety Friday, including against keeping a brothel.

The 9-0 Supreme Court ruling is a victory for sex workers seeking safer working conditions because it found that the laws violated the guarantee to life, liberty and security of the person. But the ruling won't take effect immediately because it gave Parliament a one-year reprieve to respond with new legislation.

Prostitution isn't illegal in Canada, but many of the activities associated with prostitution are classified as criminal offenses.


The high court struck down all three prostitution-related laws: against keeping a brothel, living on the avails of prostitution, and street soliciting. The landmark ruling comes more than two decades after the Supreme Court last upheld the country's anti-prostitution laws.

The decision upheld an Ontario Court of Appeal ruling last year that struck down the ban on brothels on the grounds that it endangered sex workers by forcing them onto the streets.

Chief Justice Beverley McLachlin, writing on behalf of the court, said Canada's social landscape has changed since 1990, when the Supreme Court upheld a ban on street solicitation.

"These appeals and the cross-appeal are not about whether prostitution should be legal or not," she wrote. "They are about whether the laws Parliament has enacted on how prostitution may be carried out pass constitutional muster. I conclude that they do not."

A Vancouver sex worker who was part of a group that brought the case applauded the court's decision.

"I'm shocked and pleased that our sex laws will not cause us harm in a year," Amy Lebovitch said in a news conference.

Katrina Pacey, a lawyer for the group of downtown Vancouver prostitutes, called it "an unbelievably important day for the sex workers but also for human rights."

"The court recognized that sex workers have the right to protect themselves and their safety," she said.

In 1990, the two women on Canada's Supreme Court dissented from the ruling upholding the ban on street solicitation. This time, all six men on the court sided with their three female colleagues.

"The harms identified by the courts below are grossly disproportionate to the deterrence of community disruption that is the object of the law," McLachlin wrote. "Parliament has the power to regulate against nuisances, but not at the cost of the health, safety and lives of prostitutes."

Sex-trade workers argued that much has changed since the high court last considered prostitution, including the horrific serial killings of prostitutes by Robert Pickton in British Columbia.

http://www.usatoday.com/story/news/world/2013/12/20/canada-anti-prostitution/4142685/

156
General Discussion / this is a thread where i am wrong
« on: December 20, 2013, 06:07:29 PM »
i was completely wrong

Quote from: Agrulberry the Erudite
czer called warren's wife's breast milk 'homeopathy' in the thread about it, and i thought i looked it up at the time and that it seemed like a reasonable usage. it looks like that's completely the opposite of true though so i concede that you are right and i am wrong

fuuuuck i'm wrong i'm a liar i'm the dipshit

i didn't realize the currency had a fixed cap and/or that the growth rate was designed to reduce over time. i thought it obeyed some kind of constant 2% friedman rule or something

i understand now + retract my confusion

yeah you were right about it, AD

Good points.

Didn't realize (well, now that you mention this I recall seeing pictures like the one you linked; I suppose I forgot) myelin was primarily around axons and not the neuronal body or dendrites....

I hadn't noticed that the SR wiki tries to give a specific definition of noise in that context! That's what I get for reading too quickly.

oh right you're in the employer insurance market, my bad

Actually, I've just realized that I should correct myself; I agree with your interpretation and apologize for not noticing my misreading earlier.

oh that's a good point; i was being quite clumsy using them separately

i have no idea what kind of arithmetic i was doing there taket, thanks for pointing that out; you are correct

i guess really it's the per-capita i'm whining about, come to think of it, not the PPP

157
Spamalot / i cant find my cell phone
« on: December 19, 2013, 06:28:47 PM »
guys i cant find my cell phone

158
General Discussion / hey zeke why do you hate solayce
« on: December 19, 2013, 05:36:03 PM »
why do you hate solayce

159
General Discussion / The FEC is rotting from the inside out
« on: December 19, 2013, 03:14:59 PM »
Money is flooding into federal elections in the post-Citizens United era.  And yet the agency tasked with monitoring and regulating all of that activity is close to crippled due to staff cuts and partisan bickering.

That’s according to Dave Levinthal of the Center for Public Integrity, which released a massive analysis of the Federal Election Commission and its problems earlier this week. The problems Levinthal identified with the agency include:

Quote
The commission over the past year has reached a paralyzing all-time low in its ability to reach consensus, stalling action on dozens of rulemaking, audit and enforcement matters, some of which are years old.

Despite an explosion in political spending hastened by key Supreme Court decisions, the agency’s funding has remained flat for five years and staffing levels have fallen to a 15-year low.

Analysts charged with scouring disclosure reports to ensure candidates and political committees are complying with laws have a nearly quarter-million-page backlog.

Average people — heck, average political junkies — have either never heard of the FEC, don’t really know what it does, or both. But remember: this is the rule-making and rule-enforcing entity for all federal money in politics. Also remember that we live in an age in which public financing of presidential elections is a thing of the past — 2012 was the first election since Watergate in which neither major-party nominee accepted public funds for the general election — and, thanks to super PACs, wealthy individuals have more power than ever. The price tag for the 2012 election topped $6 billion, according to the Center for Responsive Politics, and the trend line suggests that there is nowhere to go but up.



More money flowing into politics + understaffed agency riven by partisan divides = recipe for chaos. Or, as Levinthal puts it: “As the nation heads into what will undoubtedly be the most expensive midterm election in history and a 2016 presidential election that, in no small way, has already begun, the FEC is rotting from the inside out.”

http://www.washingtonpost.com/blogs/the-fix/wp/2013/12/18/the-most-important-political-2story-you-havent-heard-about/

160
Agrulian Archives / Trigonometry (high-school level)
« on: December 19, 2013, 01:55:51 PM »
Brief Subject Overview: trigonometry is the ancient study of angles and lengths in triangles. Most of us take it in middle or high school; I slept through most of my high school class. What I picked up osmotically has been more than enough to get by, with the occasional supplement, but it'd be nice to have a firmer foundation in trig, and for its various proofs to feel more familiar; it would be especially nice if proofs of the more obscure multiple-angle identities and such started to sink in and feel natural. (Bonus: got the text from a retiring faculty member's give-away stack!)
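
(For concreteness, the kind of derivation meant here, using nothing beyond the sum formulas; this is standard material, not a quote from Sullivan.) Setting $\alpha = \beta = \theta$ in the sum formulas gives the double-angle identities:
\[
\cos(\alpha+\beta) = \cos\alpha\cos\beta - \sin\alpha\sin\beta \;\Rightarrow\; \cos 2\theta = \cos^2\theta - \sin^2\theta = 1 - 2\sin^2\theta,
\]
\[
\sin(\alpha+\beta) = \sin\alpha\cos\beta + \cos\alpha\sin\beta \;\Rightarrow\; \sin 2\theta = 2\sin\theta\cos\theta .
\]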

Text(s): Sullivan's "Algebra & Trigonometry" (chs 6-8).

Assigned problems:

Replies will contain worked solutions, discussion, etc.

161
Agrulian Archives / Stochastic Calculus (grad student level)
« on: December 19, 2013, 02:03:42 AM »
Brief Subject Overview: stochastic calculus is a generalization of elementary calculus; the idea is to provide a theory for how to do integration when the integrating variable is itself a stochastic process ("dW_s"), as opposed to the deterministic infinitesimal elements ("dx") appearing in standard Riemann integrals or the deterministic function-infinitesimals ("dg(x)") appearing in Riemann-Stieltjes integrals. Standard calculus has particular trouble with stochastic integrals because the most commonly used stochastic processes---e.g. Brownian motion---tend to be almost everywhere non-differentiable, and a fortiori vary too much for standard Riemann integral definitions (if you throw an expected value into them) to yield values independent of the point selected from within each partition interval. The Ito or stochastic calculus deals with this by considering both first-order and second-order variations in its integral definition. Stochastic calculus is used to solve models of stock prices in mathematical finance, where stochastic integrals like the Ito integral are used to formulate "stochastic differential equations" (which are really integral equations, since there's no differentiation defined in stochastic calculus -- but the formal analogy to ODEs is natural, I guess).
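
A toy numerical sketch of that second-order correction (my own illustration, not from any of the candidate texts; assumes Python with numpy): by Ito's formula the integral of Brownian motion against itself is $\int_0^T W_s\,dW_s = (W_T^2 - T)/2$ rather than the classical $W_T^2/2$, and a left-endpoint Riemann sum over simulated paths reproduces the extra $-T/2$ term.

import numpy as np

# Left-endpoint (Ito) Riemann sums of W dW over simulated Brownian paths.
rng = np.random.default_rng(0)
T, n, paths = 1.0, 1000, 20000
dt = T / n
dW = rng.normal(0.0, np.sqrt(dt), size=(paths, n))     # Brownian increments
W = np.cumsum(dW, axis=1)                              # W at grid points t_1..t_n
W_left = np.hstack([np.zeros((paths, 1)), W[:, :-1]])  # W at left endpoints t_0..t_{n-1}

ito_sum = np.sum(W_left * dW, axis=1)                  # approximates \int_0^T W dW per path
target = (W[:, -1] ** 2 - T) / 2                       # Ito's formula answer
print(np.abs(ito_sum - target).mean())                 # small, and shrinks as n grows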

Text(s): not sure. Have a range of texts covering everything from 'babby's first math' to 'just commit suicide now, you're going to want to if you read me.' Gotta figure out what the most rewarding one is for my current knowledge.

Assigned problems:

TBA

Replies will contain worked solutions, discussion, etc.

162
Agrulian Archives / Markov Chains/Processes (grad student level)
« on: December 19, 2013, 01:55:38 AM »
Brief Subject Overview: Markov chains are a particularly well-studied kind of stochastic process in which the probability of observing any state s at time t+1 depends only on the state at time t; that is, there is some time dependence, but it only goes one period back. I understand Markov chains at a casual level but could really use some more rigorous and painful exposure to them. And so...
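
A minimal sketch of that one-step dependence (my own toy example, not from the text; assumes Python with numpy): a two-state chain whose long-run visit frequencies match the stationary distribution solved from pi P = pi.

import numpy as np

P = np.array([[0.9, 0.1],     # transition matrix: row = current state,
              [0.5, 0.5]])    # column = next state

# exact stationary distribution: left eigenvector of P with eigenvalue 1
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()

# simulate the chain and compare empirical visit frequencies
rng = np.random.default_rng(1)
state, counts, steps = 0, np.zeros(2), 200_000
for _ in range(steps):
    state = rng.choice(2, p=P[state])   # next state depends only on current state
    counts[state] += 1

print(pi, counts / steps)   # both roughly [0.833, 0.167]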

Text(s): MTG's "Markov Chains & Stochastic Stability".

Assigned problems:

TBA

Replies will contain worked solutions, discussion, etc.

163
Agrulian Archives / Stochastic Processes (grad student level)
« on: December 19, 2013, 01:52:23 AM »
Brief Subject Overview: stochastic processes are fundamental for modeling dependent sequences of random events occurring in time or space, and are foundational for more advanced topics, like the development of stochastic calculus. I've got a basic appreciation for them but my understanding tails off quickly as the measure theory gets heavier or time becomes continuous. Bass should fix that.

Text(s): Bass's "Stochastic Processes".

Assigned problems:

TBA

Replies will contain worked solutions, discussion, etc.

164
General Discussion / N.C. shows how to Crush the Unemployed
« on: December 18, 2013, 04:35:13 PM »
The U.S. is about to cut the maximum duration of public support for the unemployed. The federal extension of unemployment insurance expires on Jan. 1. To see the consequences, look at North Carolina.

I’ve been watching the state since July, when it cut the maximum length of benefit from 99 weeks to just 19, and reduced the weekly check from $535 to $350.


Across the country, the unemployed will lose from 14 to 47 weeks of insurance when the extension ends. Five other states will join North Carolina in providing fewer than 26 weeks of payments -- the standard in the U.S. until this year. What’s happened in North Carolina since July is an indication of what will happen nationwide. The picture is troubling.

As intended, presumably, the number of North Carolinians receiving unemployment benefits has collapsed. It’s down by 45,000, or 40 percent, since last year. Expiring benefits aren’t the only reason for this. Far fewer are filing a claim in the first place. Initial claims are running at about half last year's rate. Unemployment insurance is a thinner safety net than it has been in decades.

In addition, North Carolina’s labor force began to shrink. The state is experiencing the largest labor-force contraction it's ever seen -- 77,000 fewer people were working or searching for work this October than a year ago. This should, but won’t, settle a partisan debate. Cutting unemployment insurance apparently hasn’t encouraged the unemployed to look harder for work: It has caused them to drop out of the labor force altogether.

To get unemployment insurance, you have to actively search for work and prove that you're doing so. The drop in the labor force suggests that this incentive was effective. Without it, more people just give up.

Meanwhile, the burden of easing the financial distress caused by unemployment has shifted from public programs to private charities. According to Alan Briggs, executive director of the North Carolina Association of Food Banks, they're struggling to cope.

"The local pantries are saying, 'Give us more, give us more, give us more,'" Briggs said. "All that the county social workers can do now is give those in need the phone number for the local food bank." As he told a local news station, his food banks had been "asked to be the safety net of the safety net."

Ron Pringle, a food-bank director who oversees seven counties and 230 organizations in the state’s southeast, says they’ve seen on average a 17 percent increase in need since last year. "We’re seeing requests for food from our agencies well outside of our planned growth," Pringle said. "Some of our member agencies have been able to meet that need, but many have not."

"They’ve had to expand pantry hours, add additional days to the schedule, and take on new volunteers because they’re unable to meet the greater need," he said. "These decisions have created a whole new community of folks we're going to have to serve."

Some 1.3 million Americans will lose unemployment benefits immediately in 2014, according to a report from the National Employment Law Project. An additional 850,000 will lose them by the end of March. North Carolina just ran this policy experiment. Does Washington like what it sees?

http://www.bloomberg.com/news/2013-12-17/north-carolina-shows-how-to-crush-the-unemployed.html

165
General Discussion / The World's Newest Island, Niijima
« on: December 18, 2013, 03:56:22 PM »


The Earth is geologically dynamic. Mountains and oceans are created and destroyed over millions of years. Almost nothing is permanent on the face of the planet.

In a human lifespan, it's easy to ignore this reality. That is, until a volcano creates a new island.

In late November, a few days before Thanksgiving, an eruption began in the Pacific Ocean about 600 miles south of Tokyo in the Ogasawara Islands. Over the last few weeks, an island has formed at the volcanic site. People are calling the new land mass Niijima.
 
The island has an area of about 14 acres and it continues to grow. NASA's Earth Observatory released new images of it today.


Sometimes, these newly formed islands don't last. They erode away, or the sea floor sinks as the weight of the landmass piles up. But early signs are that Niijima will stick around, at least long enough for us to forget that it once never was.

167
When I was growing up my mom gave me a multivitamin every day as a defense against unnamed dread diseases.

But it looks like Mom was wasting her money. Evidence continues to mount that vitamin supplements don't help most people, and can actually cause diseases that people are taking them to prevent, like cancer.


Three studies published Monday add to multivitamins' bad rap. One review found no benefit in preventing early death, heart disease or cancer. Another found that taking multivitamins did nothing to stave off cognitive decline with aging. A third found that high-dose multivitamins didn't help people who had had one heart attack avoid another.

"Enough is enough," declares an editorial accompanying the studies in Annals of Internal Medicine. "Stop wasting money on vitamin and mineral supplements."

But enough is not enough for the American public. We spend $28 billion a year on vitamin supplements and are projected to spend more. About 40 percent of Americans take multivitamins, the editorial says.

Even people who know about all these studies showing no benefit continue to buy multivitamins for their families. Like, uh, me. They couldn't hurt, right?

If only it were as simple as popping a supplement and being set for life. But alas, no.

In most cases, no. But $28 billion is a lot to spend on a worthless medical treatment. So I called up Steven Salzberg, a professor of medicine at Johns Hopkins who has written about Americans' love affair with vitamins, to find out why we're so reluctant to give up the habit.

"I think this is a great example of how our intuition leads us astray," Salzberg told Shots. "It seems reasonable that if a little bit of something is good for you, them more should be better for you. It's not true. Supplementation with extra vitamins or micronutrients doesn't really benefit you if you don't have a deficiency."

Vitamin deficiencies can kill, and that discovery has made for some great medical detective stories. Salzberg points to James Lind, a Scottish physician who proved in 1747 that citrus juice could cure scurvy, which had killed more sailors than all wars combined. It was not until much later that scientists discovered that the magic ingredient was vitamin C.

Lack of vitamin D causes rickets. Lack of niacin causes pellagra, which was a big problem in the southern U.S. in the early 1900s. Lack of vitamin A causes blindness. And lack of folic acid can cause spina bifida, a crippling deformity.

Better nutrition and vitamin fortified foods have made these problems pretty much history.

Now when public health officials talk about vitamin deficiencies and health, they're talking about specific populations and specific vitamins. Young women tend to be low on iodine, which is key for brain development in a fetus, according to a 2012 report from the Centers for Disease Control and Prevention. And Mexican-American women and young children are more likely to be iron deficient. But even in that group, we're talking about 11 percent of the children, and 13 percent of the women.

Recent studies have shown that too much beta carotene and vitamin E can cause cancer, and it's long been known that excess vitamin A can cause liver damage, coma and death. That's what happened to Arctic explorers when they ate too much polar bear liver, which is rich in vitamin A.

"You need a balance," Salzberg says. But he agrees with the Annals editorial — enough already. "The vast majority of people taking multivitamins and other supplemental vitamins don't need them. I don't need them, so I stopped."

I'm still struggling with the notion that mother didn't know best. But maybe when the current bottle of chewable kid vitamins runs out, I won't buy more.

http://www.npr.org/blogs/health/2013/12/17/251955878/the-case-against-multivitamins-grows-stronger?utm_content=socialflow&utm_campaign=nprfacebook&utm_source=npr&utm_medium=facebook

168
General Discussion / Replication in Science: Backlash & Back-Backlash
« on: December 17, 2013, 01:29:06 PM »
TL;DR: high-profile cancer researcher/biologist Mina Bissell argues in a Nature op-ed that failures to replicate are usually due to improper/non-identical procedure, that failure to replicate is not really very convincing evidence that a phenomenon isn't genuine, and that replication's generally a costlier process than the movement pushing for wider emphasis on replicability of studies would have us believe. Statistician, political scientist, and replication advocate Andrew Gelman picks up on and replies to Bissell's piece at length on his blog, arguing that she's wrong, is being unhelpfully defensive, and that most of the problems she describes with replications are really problems with the limited descriptions given by researchers of their methods.

Bissell pooh-poohs replication:

Quote
Every once in a while, one of my postdocs or students asks, in a grave voice, to speak to me privately. With terror in their eyes, they tell me that they have been unable to replicate one of my laboratory's previous experiments, no matter how hard they try. Replication is always a concern when dealing with systems as complex as the three-dimensional cell cultures routinely used in my lab. But with time and careful consideration of experimental conditions, they, and others, have always managed to replicate our previous data.

Articles in both the scientific and popular press [1–3] have addressed how frequently biologists are unable to repeat each other's experiments, even when using the same materials and methods. But I am concerned about the latest drive by some in biology to have results replicated by an independent, self-appointed entity that will charge for the service. The US National Institutes of Health is considering making validation routine for certain types of experiments, including the basic science that leads to clinical trials [4]. But who will evaluate the evaluators? The Reproducibility Initiative, for example, launched by the journal PLoS ONE with three other companies, asks scientists to submit their papers for replication by third parties, for a fee, with the results appearing in PLoS ONE. Nature has targeted reproducibility [5] by giving more space to methods sections and encouraging more transparency from authors, and has composed a checklist of necessary technical and statistical information. This should be applauded.

So why am I concerned? Isn't reproducibility the bedrock of the scientific process? Yes, up to a point. But it is sometimes much easier not to replicate than to replicate studies, because the techniques and reagents are sophisticated, time-consuming and difficult to master. In the past ten years, every paper published on which I have been senior author has taken between four and six years to complete, and at times much longer. People in my lab often need months — if not a year — to replicate some of the experiments we have done on the roles of the microenvironment and extracellular matrix in cancer, and that includes consulting with other lab members, as well as the original authors.

People trying to repeat others' research often do not have the time, funding or resources to gain the same expertise with the experimental protocol as the original authors, who were perhaps operating under a multi-year federal grant and aiming for a high-profile publication. If a researcher spends six months, say, trying to replicate such work and reports that it is irreproducible, that can deter other scientists from pursuing a promising line of research, jeopardize the original scientists' chances of obtaining funding to continue it themselves, and potentially damage their reputations.

Fair wind
Twenty years ago, a reproducibility movement would have been of less concern. Biologists were using relatively simple tools and materials, such as pre-made media and embryonic fibroblasts from chickens and mice. The techniques available were inexpensive and easy to learn, thus most experiments would have been fairly easy to double-check. But today, biologists use large data sets, engineered animals and complex culture models, especially for human cells, for which engineering new species is not an option.

Many scientists use epithelial cell lines that are exquisitely sensitive. The slightest shift in their microenvironment can alter the results — something a newcomer might not spot. It is common for even a seasoned scientist to struggle with cell lines and culture conditions, and unknowingly introduce changes that will make it seem that a study cannot be reproduced. Cells in culture are often immortal because they rapidly acquire epigenetic and genetic changes. As such cells divide, any alteration in the media or microenvironment — even if minuscule — can trigger further changes that skew results. Here are three examples from my own experience.

My collaborator, Ole Petersen, a breast-cancer researcher at the University of Copenhagen, and I have spent much of our scientific careers learning how to maintain the functional differentiation of human and mouse mammary epithelial cells in culture. We have succeeded in cultivating human breast cell lines for more than 20 years, and when we use them in the three-dimensional assays that we developed [6,7], we do not observe functional drift. But our colleagues at biotech company Genentech in South San Francisco, California, brought to our attention that they could not reproduce the architecture of our cell colonies, and the same cells seemed to have drifted functionally. The collaborators had worked with us in my lab and knew the assays intimately. When we exchanged cells and gels, we saw that the problem was in the cells, procured from an external cell bank, and not the assays.

Another example arose when we submitted what we believe to be an exciting paper for publication on the role of glucose uptake in cancer progression. The reviewers objected to many of our conclusions and results because the published literature strongly predicted the prominence of other molecules and pathways in metabolic signalling. We then had to do many extra experiments to convince them that changes in media glucose levels, or whether the cells were in different contexts (shapes) when media were kept constant, drastically changed the nature of the metabolites produced and the pathways used [8].

A third example comes from a non-malignant human breast cell line that is now used by many for three-dimensional experiments. A collaborator noticed that her group could not reproduce its own data convincingly when using cells from a cell bank. She had obtained the original cells from another investigator. And they had been cultured under conditions in which they had drifted. Rather than despairing, the group analysed the reasons behind the differences and identified crucial changes in cell-cycle regulation in the drifted cells. This finding led to an exciting, new interpretation of the data that were subsequently published [9].

Repeat after me
The right thing to do as a replicator of someone else's findings is to consult the original authors thoughtfully. If e-mails and phone calls don't solve the problems in replication, ask either to go to the original lab to reproduce the data together, or invite someone from their lab to come to yours. Of course replicators must pay for all this, but it is a small price in relation to the time one will save, or the suffering one might otherwise cause by declaring a finding irreproducible.

When researchers at Amgen, a pharmaceutical company in Thousand Oaks, California, failed to replicate many important studies in preclinical cancer research, they tried to contact the authors and exchange materials. They could confirm only 11% of the papers [3]. I think that if more biotech companies had the patience to send someone to the original labs, perhaps the percentage of reproducibility would be much higher.

It is true that, in some cases, no matter how meticulous one is, some papers do not hold up. But if the steps above are taken and the research still cannot be reproduced, then these non-valid findings will eventually be weeded out naturally when other careful scientists repeatedly fail to reproduce them. But sooner or later, the paper should be withdrawn from the literature by its authors.

One last point: all journals should set aside a small space to publish short, peer-reviewed reports from groups that get together to collaboratively solve reproducibility problems, describing their trials and tribulations in detail. I suggest that we call this ISPA: the Initiative to Solve Problems Amicably.

http://www.nature.com/news/reproducibility-the-risks-of-the-replication-drive-1.14184

Andrew Gelman comments & replies on his blog:

Quote
Raghuveer Parthasarathy pointed me to an article in Nature by Mina Bissell, who writes, “The push to replicate findings could shelve promising research and unfairly damage the reputations of careful, meticulous scientists.”

I can see where she’s coming from: if you work hard day after day in the lab, it’s gotta be a bit frustrating to find all your work questioned, for the frauds of the Dr. Anil Pottis and Diederik Stapels to be treated as a reason for everyone else’s work to be considered guilty until proven innocent.

That said, I pretty much disagree with Bissell’s article, and really the best thing I can say about it is that I think it’s a good sign that the push for replication is so strong that now there’s a backlash against it. Traditionally, leading scientists have been able to simply ignore the push for replication. If they are feeling that the replication movement is strong enough that they need to fight it, that to me is good news.

I’ll explain a bit in the context of Bissell’s article. She writes:

Articles in both the scientific and popular press have addressed how frequently biologists are unable to repeat each other’s experiments, even when using the same materials and methods. But I am concerned about the latest drive by some in biology to have results replicated by an independent, self-appointed entity that will charge for the service. The US National Institutes of Health is considering making validation routine for certain types of experiments, including the basic science that leads to clinical trials.

But, as she points out, such replications will be costly. As she puts it:

Isn’t reproducibility the bedrock of the scientific process? Yes, up to a point. But it is sometimes much easier not to replicate than to replicate studies, because the techniques and reagents are sophisticated, time-consuming and difficult to master. In the past ten years, every paper published on which I have been senior author has taken between four and six years to complete, and at times much longer. People in my lab often need months — if not a year — to replicate some of the experiments we have done . . .

So, yes, if we require everything to be replicated, it will reduce the resources that are available to do new research.

Replication is always a concern when dealing with systems as complex as the three-dimensional cell cultures routinely used in my lab. But with time and careful consideration of experimental conditions, they [Bissell's students and postdocs], and others, have always managed to replicate our previous data.

If all science were like Bissell’s, I guess we’d be in great shape. In fact, given her track record, perhaps we could give some sort of lifetime seal of approval to the work in her lab, and agree in the future to trust all her data without need for replication.

The problem is that there appear to be labs without 100% successful replication rates. Not just fraud (although, yes, that does exist); and not just people cutting corners, for example, improperly excluding cases in a clinical trial (although, yes, that does exist); and not just selection bias and measurement error (although, yes, these do exist too); but just the usual story of results that don’t hold up under replication, perhaps because the published results just happened to stand out in an initial dataset (as Vul et al. pointed out in the context of imaging studies in neuroscience) or because certain effects are variable and appear in some settings and not in others. Lots of reasons. In any case, replications do fail, even with time and careful consideration of experimental conditions. In that sense, Bissell indeed has to pay for the sins of others, but I think that’s inevitable: in any system that is less than 100% perfect, some effort ends up being spent on checking things that, retrospectively, turned out to be ok.

Later on, Bissell writes:

The right thing to do as a replicator of someone else’s findings is to consult the original authors thoughtfully. If e-mails and phone calls don’t solve the problems in replication, ask either to go to the original lab to reproduce the data together, or invite someone from their lab to come to yours. Of course replicators must pay for all this, but it is a small price in relation to the time one will save, or the suffering one might otherwise cause by declaring a finding irreproducible.

Hmmmm . . . maybe . . . but maybe a simpler approach would be for the authors of the article to describe clearly what they did (with videos, for example, if that is necessary to demonstrate details of lab procedure) in the public record.

After all, a central purpose of scientific publication is to communicate with other scientists. If your published material is not clear—if a paper can’t be replicated without emails, phone calls, and a lab visit—this seems like a problem to me! If outsiders can’t replicate the exact study you’ve reported, they could well have trouble using your results in future research. To put it another way, if certain findings are hard to get, requiring lots of lab technique that is nowhere published—and I accept that this is just the way things can be in modern biology—then these findings won’t necessarily apply in future work, and this seems like a serious concern.

To me, the solution is not to require e-mails, phone calls, and lab visits—which, really, would be needed not just for potential replicators but for anyone doing further research in the field—but rather to expand the idea of “publication” to go beyond the current standard telegraphic description of methods and results, and beyond the current standard supplementary material (which is not typically a set of information allowing you to replicate the study; rather, it’s extra analyses needed to placate the journal referees), to include a full description of methods and data, including videos and as much raw data as is possible (with some scrambling if human subjects is an issue). No limits—whatever it takes! This isn’t about replication or about pesky reporting requirements, it’s about science. If you publish a result, you should want others to be able to use it.

Of course, I think replicators should act in good faith. If certain aspects of a study are standard practice and have been published elsewhere, maybe they don’t need to be described in detail in the paper or the supplementary material; a reference to the literature could be enough. Indeed, to the extent that full descriptions of research methods are required, this will make life easier for people to describe their setups in future papers.

Bissell points out that describing research methods isn’t always easy:

Twenty years ago . . . Biologists were using relatively simple tools and materials, such as pre-made media and embryonic fibroblasts from chickens and mice. The techniques available were inexpensive and easy to learn, thus most experiments would have been fairly easy to double-check. But today, biologists use large data sets, engineered animals and complex culture models . . . Many scientists use epithelial cell lines that are exquisitely sensitive. The slightest shift in their microenvironment can alter the results — something a newcomer might not spot. It is common for even a seasoned scientist to struggle with cell lines and culture conditions, and unknowingly introduce changes that will make it seem that a study cannot be reproduced. . . .

If the microenvironment is important, record as much of it as you can for the publication! Again, if it really takes a year for a study to be reproduced, if your finding is that fragile, this is something that researchers should know about right away from reading the article.

Bissell gives an example of “a non-malignant human breast cell line that is now used by many for three-dimensional experiments”:

A collaborator noticed that her group could not reproduce its own data convincingly when using cells from a cell bank. She had obtained the original cells from another investigator. And they had been cultured under conditions in which they had drifted. Rather than despairing, the group analysed the reasons behind the differences and identified crucial changes in cell-cycle regulation in the drifted cells. This finding led to an exciting, new interpretation of the data that were subsequently published.

That’s great! And that’s why it’s good to publish all the information necessary so that a study can be replicated. That way, this sort of exciting research could be done all the time.

Costs and benefits

The other issue that Bissell is (implicitly) raising is a cost-benefit calculation. When she writes of the suffering caused by declaring a finding irreproducible, I assume that ultimately she’s talking about a patient who will get sick or even die because some potential treatment never gets developed or never becomes available because some promising bit of research got dinged. On the other hand, when research that is published in a top journal does not hold up, it can waste thousands of hours of researchers’ time, spending resources that otherwise could have been used on productive research.

Indeed, even when we talk about reporting requirements, we are really talking about tradeoffs. Clearly writing up one’s experimental protocol (and maybe including a Youtube) and setting up data in archival form takes work; it represents time and effort that could otherwise be spent on research (or even on internal replication). On the other hand, when methods and data are not clearly set out in the public record, this can result in wasted effort by lots of other labs, following false leads as they try to figure out exactly how the experiment was done.

I can’t be sure, but my guess is that, for important, high-profile research, on balance it’s a benefit to put all the details in the public record. Sure, that takes some effort by the originating lab, but it might save lots more effort for each of dozens of other labs that are trying to move forward from the published finding.

Here’s an example. Bissell writes:

When researchers at Amgen, a pharmaceutical company in Thousand Oaks, California, failed to replicate many important studies in preclinical cancer research, they tried to contact the authors and exchange materials. They could confirm only 11% of the papers. I think that if more biotech companies had the patience to send someone to the original labs, perhaps the percentage of reproducibility would be much higher.

I worry about this. If people can’t replicate a published result, what are we supposed to make of it? If the result is so fragile that it only works under some conditions that have never been written down, what is the scientific community supposed to do with it?

And there’s this:

It is true that, in some cases, no matter how meticulous one is, some papers do not hold up. But if the steps above are taken and the research still cannot be reproduced, then these non-valid findings will eventually be weeded out naturally when other careful scientists repeatedly fail to reproduce them. But sooner or later, the paper should be withdrawn from the literature by its authors.

Yeah, right. Tell it to Daryl Bem.

What happened?

I think that where Bissell went wrong is by thinking of replication in a defensive way, and thinking of the result being to “damage the reputations of careful, meticulous scientists.” Instead, I recommend she take a forward-looking view, and think of replicability as a way of moving science forward faster. If other researchers can’t replicate what you did, they might well have problems extending your results. The easier you make it for them to replicate, indeed the more replications that people have done of your work, the more they will be able, and motivated, to carry on the torch.

Nothing magic about publication

Bissell seems to be saying that if a biology paper is published, it should be treated as correct, even if outsiders can’t replicate it, all the way until the non-replicators “consult the original authors thoughtfully,” send emails and phone calls, and “either to go to the original lab to reproduce the data together, or invite someone from their lab to come to yours.” After all of this, if the results still don’t hold up, they can be “weeded out naturally from the literature”—but, even then, only after other scientists “repeatedly fail to reproduce them.”

This seems pretty clear: you need multiple failed replications, each involving thoughtful conversation, email, phone, and a physical lab visit. Until then, you treat the published claim as true.

OK, fine. Suppose we accept this principle. How, then, do we treat an unpublished paper? Suppose someone with a Ph.D. in biology posts a paper on Arxiv (or whatever is the biology equivalent), and it can’t be replicated? Is it ok to question the original paper, to treat it as only provisional, to label it as unreplicated? That’s ok, right? I mean, you can’t just post something on the web and automatically get the benefit of the doubt that you didn’t make any mistakes. Ph.D.’s make errors all the time (just like everyone else).

Now we can engage in some salami slicing. According to Bissell (as I interpret here), if you publish an article in Cell or some top journal like that, you get the benefit of the doubt and your claims get treated as correct until there are multiple costly, failed replications. But if you post a paper on your website, all you’ve done is make a claim. Now suppose you publish in a middling journal, say, the Journal of Theoretical Biology. Does that give you the benefit of the doubt? What about Nature Neuroscience? PNAS? Plos-One? I think you get my point. A publication in Cell is nothing more than an Arxiv paper that happened to hit the right referees at the right time. Sure, approval by 3 referees or 6 referees or whatever is something, but all they did is read some words and look at some pictures.

It’s a strange view of science in which a few referee reports is enough to put something into a default-believe-it mode, but a failed replication doesn’t count for anything. Bissell is criticizing replicators for not having long talks and visits with the original researchers, but the referees don’t do any emails, phone calls, or lab visits at all! If their judgments, based simply on reading the article, carry weight, then it seems odd to me to discount failed replications that are also based on the published record.

My view that we should focus on the published record (including references, as appropriate) is not legalistic or nitpicking. I’m not trying to say: Hey, you didn’t include that in the paper, gotcha! I’m just saying that, if somebody reads your paper and can’t figure out what you did, and can only do that through lengthy emails, phone conversations, and lab visits, then this is going to limit the contribution your paper can make.

As C. Glenn Begley wrote in a comment:

A result that is not sufficiently robust that it can be independently reproduced will not provide the basis for an effective therapy in an outbred human population. A result that is not able to be independently reproduced, that cannot be translated to another lab using what most would regard as standard laboratory procedures (blinding, controls, validated reagents etc) is not a result. It is simply a ‘scientific allegation’.

To which I would add: Everyone would agree that the above paragraph applies to an unpublished article. I’m with Begley that it also applies to published articles, even those published in top journals.

A solution that should make everyone happy

Or, to put it another way, maybe Bissell is right that if someone can’t replicate your paper, it’s no big deal. But it’s information I’d like to have. So maybe we can all be happy: all failed replications can be listed on the website of the original paper (then grumps and skeptics like me will be satisfied), but Bissell and others can continue to believe published results on the grounds that the replications weren’t careful enough. And, yes, published replications should be held to the same high standard. If you fail to replicate a result and you want your failed replication to be published, it should contain full details of your lab setup, with videos as necessary.

http://andrewgelman.com/2013/12/17/replication-backlash/

169
Agrulian Archives / General Relativity (undergrad/grad student level)
« on: December 17, 2013, 01:13:57 PM »
Brief Subject Overview: generalization of special relativity, due to Einstein. Models gravitation as the literal geometry of the world, and imposes the speed of light as a limit on the speed of objects (if begun at a lower speed than that of light). If I understand correctly (probably I do not), special relativity argues that all physical laws are invariant across non-accelerating reference frames, as opposed to Newtonian mechanics which only said something to this effect about Newton's first law. General relativity then further generalizes this, insisting that all physical law is invariant in all reference frames, including accelerating (non-inertial) frames.
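
(For reference, the standard statement of the "gravitation as geometry" idea, not quoted from any particular text above: curvature is sourced by energy-momentum, and free particles follow geodesics of the resulting metric.)
\[
G_{\mu\nu} \;=\; R_{\mu\nu} - \tfrac{1}{2}R\,g_{\mu\nu} \;=\; \frac{8\pi G}{c^4}\,T_{\mu\nu},
\qquad
\frac{d^2 x^{\mu}}{d\tau^2} + \Gamma^{\mu}{}_{\alpha\beta}\,\frac{dx^{\alpha}}{d\tau}\frac{dx^{\beta}}{d\tau} = 0 .
\]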

Text(s): MTW's "Gravitation" OR Carroll's "Spacetime & Geometry" OR Schutz's "A First Course in General Relativity" OR Taylor & Wheeler's "Spacetime Physics" (not sure which of these I feel most comfortable with yet).

Assigned problems:

TBA

Replies will contain worked solutions, discussion, etc.

170
Agrulian Archives / Procedural Texturing (undergrad student level)
« on: December 17, 2013, 01:01:55 PM »
Brief Subject Overview: procedural texturing is the study of the automated creation of interesting and/or realistic 2D/3D graphics, landscapes, etc. It has been used in, for example, Minecraft and various Minecraft clones.
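
A bare-bones sketch of the core trick (my own toy code, not an excerpt from Ebert et al.; assumes Python with numpy): smooth a lattice of random values into "value noise", then sum octaves at doubling frequency and halving amplitude (fractional Brownian motion) to get the fractal, terrain-like patterns that Minecraft-style generators rely on.

import numpy as np

rng = np.random.default_rng(42)

def value_noise(x, y, lattice):
    """Bilinearly interpolate random lattice values with a smoothstep fade."""
    n = lattice.shape[0]
    x0, y0 = np.floor(x).astype(int) % n, np.floor(y).astype(int) % n
    x1, y1 = (x0 + 1) % n, (y0 + 1) % n
    fx, fy = x - np.floor(x), y - np.floor(y)
    fx, fy = fx * fx * (3 - 2 * fx), fy * fy * (3 - 2 * fy)   # smoothstep fade curves
    top = lattice[x0, y0] * (1 - fx) + lattice[x1, y0] * fx
    bot = lattice[x0, y1] * (1 - fx) + lattice[x1, y1] * fx
    return top * (1 - fy) + bot * fy

def fbm(x, y, lattice, octaves=5):
    """Sum octaves of value noise at doubling frequency and halving amplitude."""
    total, amp, freq = 0.0, 1.0, 1.0
    for _ in range(octaves):
        total = total + amp * value_noise(x * freq, y * freq, lattice)
        amp, freq = amp * 0.5, freq * 2.0
    return total

lattice = rng.random((64, 64))
xs, ys = np.meshgrid(np.linspace(0, 8, 256), np.linspace(0, 8, 256))
heightmap = fbm(xs, ys, lattice)        # 256x256 array, usable as a terrain-style texture
print(heightmap.shape, heightmap.min(), heightmap.max())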

Text(s): Ebert et al's "Texturing & Modeling".

Assigned problems:

TBA (not sure this has any problems in it, don't remember any. Might just have to work through each code example as it comes)

Replies will contain worked solutions, discussion, etc.

171
NITIN JAIN is the big man in Lasalgaon, a dusty town a day’s drive from Mumbai that boasts it has Asia’s biggest onion market. With a trim moustache and a smartphone stuck to his ear he struts past a thousand-odd tractors and trucks laden with red onions. Farmers hurl armfuls at his feet to prove their quality. A gaggle of auctioneers, rival traders and scribes follow him, squabbling and yanking each other’s hair. Asked why onion prices have risen so much, Mr Jain relays the question to the market. “Why?” he bellows. His entourage laughs. He says that the price of India’s favourite vegetable is a mystery that no calculation can explain.

High food prices perturb some men and women even bigger than Mr Jain. Raghuram Rajan, the boss of India’s central bank, is grappling with high inflation caused in large part by food prices: wholesale onion prices soared by 278% in the year to October and the retail price of all vegetables shot up by 46%. The food supply chain is decades out of date and cannot keep up with booming demand. India’s rulers are watching the cost of food closely, too, ahead of an election due by May. Electoral folklore says that pricey onions are deadly for incumbent governments.

A year ago it seemed that India had bitten the bullet by permitting foreign firms to own majority stakes in domestic supermarkets. The decision came after a fierce political battle. Walmart, Carrefour and Tesco have been waiting for years to invest in India. They say they would revolutionise shopping. Only 2-3% of groceries are bought in formal stores, with most people reliant on local markets. They would also modernise logistics chains, either by investing themselves, or indirectly, by stimulating food producers to spend on factories, warehouses and trucks, and establish direct contracts with farmers, eliminating layers of middlemen.

On the ground little has happened. Foreign firms complain of hellish fine print, including a stipulation to buy from tiny suppliers. Individual Indian states can opt out of the policy—which is unhelpful if you want to build a national supermarket chain. In October Walmart terminated its joint venture with Bharti, an Indian group. India has reduced the beast of Bentonville to a state of bewilderment. Tesco has cut expatriate staff.

The reaction from politicians has been indifference. “We have liberalised…to the extent that we can. People have to accept this and decide whether they want to invest,” said Palaniappan Chidambaram, India’s finance minister. Despite the apparently obvious benefits of supermarkets and the experience of most other countries, few Indians seem to want change.

You’re not in Bentonville anymore

Just how bad is India’s food supply chain? To find out The Economist followed the journey of an onion from a field in the heart of onion country, in western India, to a shopping bag in Mumbai, a city of 18m onion-munchers. The trip suggests an industry begging for investment and reform.

“The system hasn’t changed much—it’s been the same since the 1970s,” says Punjaram Devkar, an elderly farmer in a white cap. For generations his forefathers have grown onions near a hamlet called Karanjgaon. He owns a crudely irrigated six-hectare (14-acre) plot, larger than the national average farm of just 1.2 hectares. He does not want to buy more land; unreliable electricity and labour mean “it is too hard to manage.” There are four onion crops each year—in a good season production is three times higher than in a bad one. To hedge his bets he also grows sugar cane. Costs have soared because of rising rural wages, which have doubled in three years. He says welfare schemes have made workers lazy. “They just play cards all day.”

Storage facilities amount to a wooden basket inside a shed—at this time of year onions perish within 15 days, although the variety grown in the spring can last eight months. From here one of Mr Devkar’s finest is thrown into a small trailer, along with the produce of nearby farms, and taken to Lasalgaon. The roads are mostly paved but the 32km (19-mile) journey takes a couple of hours in a rickety old tractor.

Lasalgaon neophytes will find their eyes water upon entering its market, a huge car park full of red onions, trucks and skinny farmers. Although the auction is run twice daily by an official body, it doesn’t look wholly transparent. Some farmers complain that Mr Jain and another trader dominate the trade (Mr Jain denies this). Prices vary wildly day by day and according to size and quality, which are judged in a split second by eye. The average price today is $0.33 per kilo.

Neither traders nor farmers agree why prices have risen so steeply of late. They blame climate change, the media, too much rain last year, too little rain this year, labour costs, an erratic export regime. “Our biggest problem is illiteracy,” says one farmer. “We don’t know how to use technology.” Most folk agree that India needs better cold storage but worry that it is too pricey or that it ruins the taste of onions.

Farmers must pay a 1% fee to the auction house and a 4% commission to the traders. Sometimes they also have to stump up for fees for packing and loading. That takes place at several depots surrounding the market where farmers must drop off their loads and pour them onto tarpaulins on the ground. The onions may wait there for days but once put into hessian sacks they are loaded onto trucks operated by separate haulage firms and owned by intricate webs of independent consortiums.

At 8pm Prabhakar Vishad, a 20-year veteran of the onion-express highway from Lasalgaon to Mumbai, climbs into a battered Tata truck with “Blow Horn” painted in big letters on the back. Over the years the roads have improved and power steering has made life easier. Still, it is dangerous work, says Mr Vishad, who had a bad crash last year. By 6am next morning he sets his bloodshot eyes on Vashi market on the outskirts of Mumbai. It handles 100-150 truckloads of onions a day—enough to satisfy India’s commercial capital.

Onions are sometimes unpacked, sorted and repacked, with wastage rates of up to 20%. By 9am the market is a teeming maze of 300-odd selling agents, who mainly act on behalf of middlemen, and several thousand buyers—who are either retailers or sub-distributors. Everyone stands ankle deep in onions of every size. The bidding process is opaque. The selling agents each drape a towel on their arm. To make a bid you stick your hand under the towel and grip their hand, with secret clenches denoting different prices. Average prices today are about $0.54 per kilo. If the seller likes your tickles you hail a porter. He carries your newly bought sacks on his head to a dispatch depot where another group of couriers takes them into the city.

“I’m crazy, like the guys you see in the movies. I don’t negotiate,” declares Sanjay Pingle. One of the market’s biggest agents, he charges the seller a 6.5% commission. The buyers pay loading charges on top of that and a fee to the market. He says business is tough—bad debts from customers run at a fifth of sales and he has to pay interest rates of 22% on his own debts. The solution to the onion shortage is obvious, he says. “In China they keep things in storage facilities—if India had the same facilities as China has, prices would be lower.” He says he has seen photographs of Chinese technology on his mobile phone.

By the afternoon thousands of cars and trucks are picking up small batches of onions to take them into Mumbai. In Chembur, a middle-class neighbourhood, Anburaj Madar runs a big sub-distributor. He handles 200 sacks a day which he sells to retailers and restaurants. He buys daily from Vashi market and has space to store only about 12 hours’ worth of stock. Rent is dear and he too reckons cold storage destroys the flavour of onions. He marks up his prices by perhaps 20% but says a chunk of what he buys has to be thrown away—it is either damaged or of inferior quality.

For the onions that do make the cut the next stop is a small shop down the road where they are sold for another mark-up of 10% or so. From here Indubai Kakdi is hand-selecting onions with elaborate care. Buck-toothed and ragged, she sells seven kilos a day from a wooden barrow; she makes a 10% margin. She says climate change has made prices more volatile.

Peeling back the layers of truth

The journey of an onion from Mr Devkar’s field to the end customer in Mumbai takes only a few days but is enough to make you weep. There are some underlying reasons why prices have risen—higher rural wages have pushed up farmers’ costs. But the system is horribly fiddly. Farms are tiny with no economies of scale. The supply chain involves up to five middlemen. The onion is loaded, sorted or repacked at least four times. Wastage rates, either from damage or weight loss as onions dry out, are a third or more. Because India has no modern food-processing industry, low-quality onions that could be turned into paste or sauces are thrown away. Retail prices are about double what farmers receive, although the lack of any standard grading of size or quality makes comparisons hard.
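
A rough back-of-the-envelope tally of that chain, using only the figures quoted above (illustrative only: fees, wastage and the exact order of the mark-ups vary):

farm_gate = 0.33   # $/kg, average at the Lasalgaon auction
wholesale = 0.54   # $/kg, average at Vashi market in Mumbai
retail = wholesale * 1.20 * 1.10 * 1.10   # sub-distributor ~20%, shop ~10%, barrow ~10%
print(round(retail, 2), round(retail / farm_gate, 1))
# about $0.78/kg at the barrow, roughly 2.4x the farm-gate price, in the same
# ballpark as the article's "about double what farmers receive"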

The system is volatile as well as inefficient. Traders who buy onions from farmers may hoard them, but for the supply chain as a whole far too little inventory is stored. As a result small variations in demand and supply are amplified and cause violent swings in price. In the first week of December 2013 prices fell again.

It is easy to see how heavy investment by supermarket chains and big food-producers—whether Indian or foreign—could make a difference. They would cut out layers from the supply chain, build modern storage facilities and probably prod farmers to consolidate their plots.

The shoppers of Chembur agree that Indian onions are the world’s tastiest but are fed up with price swings. No one mentions reform as a solution and until there is popular support and political leadership it is hard to see much changing. And what of the last stage of the onion’s odyssey, to the stomach? By one stall stands an elderly lady who says she likes the vegetable so much that she doesn’t bother to cook it. Instead she chomps on raw onions as if they were apples. At least someone has an eye on efficiency.

http://www.economist.com/news/business/21591650-walmart-carrefour-and-tesco-have-been-knocking-indias-door-without-much-luck-route

172
This thread's going to deal with examining the proofs of specific theorems / propositions / lemmata of interest in various subjects. Everything from "shit I learned but never appreciated in detail in freshman calculus" to "differential topology on Calabi-Yau manifolds expressed as Fourier series known to lie in the complexity class PLS" is fair game. Here's the current list, organized by subject (crossed-off indicates "already done in a reply", as in other threads):

Elementary & Vector Calculus
- Differential & Integral forms of the Chain Rule
- Theorem for Differentiating Inverses of Fxns
- univariate Taylor's Theorem
- multivariate Taylor's Theorem
- L'Hopital's Theorem
- Fundamental Theorem of Calculus
- Green's Theorem
- Stokes Theorem
- Divergence Theorem
- Clairaut's Theorem
- Change of Variables Theorem(s)

Number Theory
- "Product Rule" for Finite Summations (? unsure of categorization, is calculus-like)

General Algorithms & Computational Complexity
- PLS=NP implies NP=co-NP
- weak Nash PPAD-completeness
- strong Nash FIXP-completeness
- NP ≠ co-NP implies no NP-Complete problem in co-NP & no co-NP complete problem in NP
- Integer programming w/ a Unitary Matrix is in P
- Cook's Theorem (direct proof that an NP-complete problem exists)
- Prime factorization: Shor's algorithm, Schnorr-Seysen-Lenstra algorithm, AKS primality test showing PRIMES is in P
- Hierarchy Theorems
- Approximation Algorithms: PCP Theorem
- Time/space complexity of Euclid's algorithm

Optimization
- Karush-Kuhn-Tucker conditions: basic sufficient conditions for their use
- KKT conditions: invexity is the broadest class of relevant fxns
- Stochastic Programming is Irrational (d/n respect Stochastic Dominance?)
- Simplex Algorithm convergence
- Simplex Algorithm exponential worst-case complexity
- Simplex polynomial smoothed complexity
- Ellipsoid Method Polynomial-time Convergence
- Convex Programming Polynomial-time Convergence
- No Free Lunch Theorem
- Benders Decomposition convergence proof

Decision Theory & Mathematical Psychology
- Cumulative-prospect Theory respects 1st-order Stochastic Dominance

Stochastic Processes
- Under ??? conditions (irreducibility?) a Markov Chain has Steady-state Distribution
- MCMC converges to Arbitrary Distribution

Logic
- Godel's Completeness Theorem
- Godel's Incompleteness Theorems

Econ & Finance
- Debreu's Existence Theorem
- No Trade Theorem

Game Theory
- Existence for: Nash, Bayes-Nash, Perfect, Trembling-Hand, etc equilibria
- 3+ player rational games may have only irrational solutions
- 2 player rational games always have rational solutions
- Revelation Principle

Graph Theory
- Four-color Theorem
- Sperner's Lemma

Fixed-Point Theorems
- Tarski's Fixed Point Theorem
- Banach's Fixed Point Theorem
- Brouwer's Fixed Point Theorem
- Kakutani's Fixed Point Theorem
- C1 and CN versions of Rouche's Theorem
- Schauder Fixed Point Theorem

Stochastic Calculus
- Ito's Lemma

Topology
- Tychonoff's Theorem
- Borsuk-Ulam Theorem (relatedly: Brouwer's Theorem)
- Baire Category Theorem
- Jordan Curve Theorem
- Ham Sandwich Theorem
- Poincaré conjecture

Convex Analysis
- Separating Hyperplane Theorem
- Farkas' Lemma

Fourier Analysis
- "Large Classes" of Fxns can be Fourier Expanded

Functional Analysis
- Metric spaces are completable
- Normed spaces are completable
- Hahn-Banach Theorem
- Banach Fixed Point Theorem
- Open Mapping Theorem (Functional Analysis)
- Closed Graph Theorem
- Baire Category Theorem

Analysis
- Implicit Fxn Theorem

ODEs and PDEs
- Poincaré-Bendixson Theorem
- Hartman-Grobman Theorem

Chaos Theory
- (In)Equivalences b/w Defns of Chaos in Elaydi
- Horseshoe Map is Chaotic (Wiggins)
- Shift Map is Chaotic (Wiggins)

Probability & Measure
- Central Limit Theorem(s)
- Strong Law of Large Numbers

Algebra & Galois Theory
- Classification/Factorization Theorems for (Finite) Abelian Groups
- Abel-Ruffini Theorem (quintic poly's do not have closed-form solutions in general, i.e. are not solvable in radicals)
- Examples/proofs of domains without unique factorization (non-UFDs)
- Lagrange's Theorem (in group theory)
- Sylow Theorems

Complex Analysis
- Cauchy's Integral & Residue Theorems (the ones that say a closed path integral gives 0 or some weird sum involving 2 pi i, depending on how many badly behaved points the fxn has inside the path)
- Open Mapping Theorem
- C1 and CN versions of Rouche's Theorem
- Proof of Euler's CIS (cos + i sin) Formula

Statistical Inference
- Ugly Duckling Theorem
- Some sorta general proof that regularization can be done w/ optimization or priors

Physics
- Gauss's Law

173
Agrulian Archives / Unresolved Questions
« on: December 15, 2013, 01:19:35 PM »
Gonna use this thread to keep track of which problems I attempt and do not finish, or skip entirely out of laziness. Also questions that arise in the course of reading an argument or proof that I don't fully resolve. Hopefully this will encourage me to return to them at a later date:

Electromagnetism
- Electromagnetism, Purcell & Morin, Ch. 1, # 7. (asks me to write some code; I was feeling too lazy to do this)

Chaos
- Chaos Theory, Elaydi, Sec. 1.4-1.5, # 4. (stuck on a sub-part of the proof of asymptotic instability of one of the equilibrium points. Had a neat idea for solving it but haven't made it work in practice yet)

Functional Analysis
- Functional Analysis, Kreyszig, Sec. 1.6 # 5(b). (haven't figured this one out; asks for an example of two homeomorphic metric spaces, one complete and the other incomplete.)

Classical Mechanics
- Classical Mechanics, Taylor, Ch. 1, # 43 a). I solved the problem fully, but got stuck partway through one of the two approaches I took to deriving the polar unit vector hat(phi) (the more rigorous and potentially satisfying of the two, imo). There's something I'm missing about how to use the normalization condition to jump from where I got to the final result.

- Classical Mechanics, Taylor, Ch. 2, # 50. I 'solved the problem' in the sense Taylor would have expected of an undergrad in his class, I think, but my justification for interchanging differentiation and infinite summation was just a quick reference to a theorem from real/complex/metric analysis: it wasn't carefully detailed, I didn't prove the theorem independently, and I didn't even prove that its preconditions (uniform convergence, or an infinite radius of convergence?) hold for e^z, just kind of said they did. Shouldn't be too hard to go back later and cite/prove a relevant, specific result and establish its premises (a sketch of the statement I have in mind is below, after these Classical Mechanics items); for completeness of understanding I think I should do so.

- in Taylor's Classical Mechanics, he differentiates a function f(y') by y and claims the result is 0; similarly he differentiates f(y) by y' and gets 0. But y clearly determines y', and the form of y' constrains the form of y implicitly as well, so why don't we have to get all chain rule on this bitch?
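
The power-series fact I have in mind for the e^z item above (standard statement, from memory, so the hypotheses should be double-checked against a real analysis text): if f(z) = \sum_{n \ge 0} a_n z^n has radius of convergence R, then f is differentiable on |z| < R and f'(z) = \sum_{n \ge 1} n a_n z^{n-1}, with the same radius of convergence R. For e^z = \sum_{n \ge 0} z^n / n!, the ratio test gives R = \infty, so differentiating term by term is justified at every z and returns e^z.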

Smooth Manifolds
- Lee2, Ch. 1, Lem. 1.6: every top n-mani has a cntble basis of precompact coord balls. At the end of p. 8, how do we know the collection of inv imgs of elems of B gives us precompact elems in M? Probably easy but need to write it out, can't see it immediately.

- Lee2, Ch. 1, in-chapter prob # 1.2. Sketched some ideas on how to show 2nd countability & Hausdorff-ness of RPn, but I left a lot of detail unclear. Specifically --- Hausdorff q's: what epsilon should be chosen for the form of the sets I defined in the Hausdorff argument? And are those sets actually open, or do I need to perturb each component to get an open set? 2nd count q's: I think I guessed the correct basis, but I didn't go to any effort to show that it worked, and it has some similar problems to the Hausdorff argument, i.e. it relies on perturbing elements of reps of linear 1-d subspaces. Show rigorously that this does what it should do? Seems like a common theme is I could use a clean formal test for equivalence between the linear subspaces rep'd by a vector; I think just such a test is the "can multiply by a nonzero constant" property, i.e. this is a characterization of reps for a 1-d lin subspace.

Topological Manifolds
- Lee1, Ch. 2, Ex 2.28: got a little lazy and identified the discontinuity in arcsin graphically. Also didn't really show injectivity or surjectivity, just kind of stated them as well-known properties of the relevant trig fxns and complex exponential map, resp. Should return and show this stuff analytically.

Calculus of Variations
- Fox, Ch. 2, Sec 4, proof of Lemma 2. Fox wants to argue that the product (non-integral) term in a certain integration-by-parts vanishes; the product includes terms of the form t(b)^2/u(b) and t(a)^2/u(a), and we know the numerators are 0, but nowhere does Fox show that u(a), u(b) are nonzero. Later in the chapter he proves that u(x) "cannot have a double root," by which I think he means that only u(x)=0 or u'(x)=0 can be true, not both, for any given x. He remarks that this shows that "both t(b)^2/u(b) and t(a)^2/u(a) vanish since t(a) = t(b) = 0 by hypothesis." I have no idea why he thinks what he's shown implies what he says it shows; maybe there's some kind of Taylor expansion & limiting argument being made? He's generally pretty good about spelling out the details, though, so I wonder if I'm not just making a stupid oversight.

174
Agrulian Archives / Quick Tutorials on Subject Basics
« on: December 14, 2013, 10:01:48 PM »
Gonna fill this thread up with videos/wikis/articles/etc that provide accessible ways for remembering/understanding/deriving various elementary facts. This video on deriving the most commonly used values in the unit circle, for example:
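
From memory (the video may do it differently): bisect an equilateral triangle of side 1 to get a right triangle with legs 1/2 and sqrt(3)/2 and hypotenuse 1, and take a right isosceles triangle with legs 1 and hypotenuse sqrt(2). Reading off opposite/hypotenuse and adjacent/hypotenuse gives

sin(pi/6) = 1/2,  cos(pi/6) = sqrt(3)/2
sin(pi/4) = cos(pi/4) = sqrt(2)/2
sin(pi/3) = sqrt(3)/2,  cos(pi/3) = 1/2

and the rest of the commonly used unit-circle values follow by reflecting these points across the axes.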


175
Spamalot / I feel the love. And I feel it burn
« on: December 14, 2013, 05:01:38 PM »
song is about chlamydia

176
Agrulian Archives / The Arbitrary Mathematical Vomit Thread
« on: December 13, 2013, 02:21:05 AM »
Documenting whimsical observations / conversations 'bout math here.

177
General Discussion / Machine Learning for Adaptive Drugs
« on: December 13, 2013, 12:43:43 AM »
Computer scientists at the Harvard School of Engineering and Applied Sciences (SEAS) and the Wyss Institute for Biologically Inspired Engineering at Harvard University have joined forces to put powerful probabilistic reasoning algorithms in the hands of bioengineers.

In a new paper presented at the Neural Information Processing Systems conference on December 7, Ryan P. Adams and Nils Napp have shown that an important class of artificial intelligence algorithms could be implemented using chemical reactions.

These algorithms, which use a technique called "message passing inference on factor graphs," are a mathematical coupling of ideas from graph theory and probability. They represent the state of the art in machine learning and are already critical components of everyday tools ranging from search engines and fraud detection to error correction in mobile phones.
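
For a sense of what the underlying computation looks like, here is a tiny sum-product (belief propagation) sketch on a two-variable factor graph. This is a generic illustration of message passing written from scratch, not the chemical-reaction compiler the paper describes, and the factor tables are made up:

import numpy as np

# Two binary variables x1, x2 with unary factors f1, f2 and a pairwise factor f12.
f1 = np.array([0.9, 0.1])            # unnormalized factor on x1
f2 = np.array([0.5, 0.5])            # unnormalized factor on x2
f12 = np.array([[0.8, 0.2],          # pairwise factor f12(x1, x2)
                [0.3, 0.7]])

# Sum-product message from x1's side through f12 to x2:
#   m(x2) = sum_{x1} f1(x1) * f12(x1, x2)
m_to_x2 = f1 @ f12

# Belief at x2 is the product of its incoming messages, then normalize.
belief = m_to_x2 * f2
marginal_x2 = belief / belief.sum()

# Brute-force marginal for comparison; on a tree-shaped graph they agree exactly.
joint = f1[:, None] * f12 * f2[None, :]
exact = joint.sum(axis=0) / joint.sum()

print(marginal_x2, exact)   # both [0.75, 0.25]

On graphs with loops the same message updates are run iteratively ("loopy" belief propagation) and yield approximate marginals.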

Adams' and Napp's work demonstrates that some aspects of artificial intelligence (AI) could be implemented at microscopic scales using molecules. In the long term, the researchers say, such theoretical developments could open the door for "smart drugs" that can automatically detect, diagnose, and treat a variety of diseases using a cocktail of chemicals that can perform AI-type reasoning.

"We understand a lot about building AI systems that can learn and adapt at macroscopic scales; these algorithms live behind the scenes in many of the devices we interact with every day," says Adams, an assistant professor of computer science at SEAS whose Intelligent Probabilistic Systems group focuses on machine learning and computational statistics. "This work shows that it is possible to also build intelligent machines at tiny scales, without needing anything that looks like a regular computer. This kind of chemical-based AI will be necessary for constructing therapies that sense and adapt to their environment. The hope is to eventually have drugs that can specialize themselves to your personal chemistry and can diagnose or treat a range of pathologies."

Adams and Napp designed a tool that can take probabilistic representations of unknowns in the world (probabilistic graphical models, in the language of machine learning) and compile them into a set of chemical reactions that estimate quantities that cannot be observed directly. The key insight is that the dynamics of chemical reactions map directly onto the two types of computational steps that computer scientists would normally perform in silico to achieve the same end.

This insight opens up interesting new questions for computer scientists working on statistical machine learning, such as how to develop novel algorithms and models that are specifically tailored to tackling the uncertainty molecular engineers typically face. In addition to the long-term possibilities for smart therapeutics, it could also open the door for analyzing natural biological reaction pathways and regulatory networks as mechanisms that are performing statistical inference. Just like robots, biological cells must estimate external environmental states and act on them; designing artificial systems that perform these tasks could give scientists a better understanding of how such problems might be solved on a molecular level inside living systems.

"There is much ongoing research to develop chemical computational devices," says Napp, a postdoctoral fellow at the Wyss Institute, working on the Bioinspired Robotics platform, and a member of the Self-organizing Systems Research group at SEAS. Both groups are led by Radhika Nagpal, the Fred Kavli Professor of Computer Science at SEAS and a Wyss core faculty member. At the Wyss Institute, a portion of Napp's research involves developing new types of robotic devices that move and adapt like living creatures.

"What makes this project different is that, instead of aiming for general computation, we focused on efficiently translating particular algorithms that have been successful at solving difficult problems in areas like robotics into molecular descriptions," Napp explains. "For example, these algorithms allow today's robots to make complex decisions and reliably use noisy sensors. It is really exciting to think about what these tools might be able to do for building better molecular machines."

Indeed, the field of machine learning is revolutionizing many areas of science and engineering. The ability to extract useful insights from vast amounts of weak and incomplete information is not only fueling the current interest in "big data," but has also enabled rapid progress in more traditional disciplines such as computer vision, estimation, and robotics, where data are available but difficult to interpret. Bioengineers often face similar challenges, as many molecular pathways are still poorly characterized and available data are corrupted by random noise.

Using machine learning, these challenges can now be overcome by modeling the dependencies between random variables and using them to extract and accumulate the small amounts of information each random event provides.

"Probabilistic graphical models are particularly efficient tools for computing estimates of unobserved phenomena," says Adams. "It's very exciting to find that these tools map so well to the world of cell biology."

http://www.sciencedaily.com/releases/2013/12/131212160349.htm?utm_source=feedburner&utm_medium=email&utm_campaign=Feed%3A+sciencedaily%2Fcomputers_math%2Fstatistics+%28ScienceDaily%3A+Computers+%26+Math+News+--+Statistics%29

178
Brief Subject Overview: a number of phase transitions are familiar to us from everyday experience: freezing, melting, evaporation, etc. There are others, but to be blunt I don't know much about them, though this is one of the areas of physics I am most interested in appreciating. My dim understanding is that in the classical theory of phase transitions a number of quantities turned out to have divergent limits near critical points, and no one could quite figure out how to handle those divergences sensibly; renormalization group theory is what ultimately corrected this issue. The text I'll be using is about this theory.
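
A toy example of the kind of renormalization-group calculation I'd like to be comfortable with (the standard 1D Ising decimation, written from memory, not taken from Zinn-Justin): for the 1D Ising chain with dimensionless coupling K = J / (k_B T), summing out every other spin gives an effective chain whose coupling K' satisfies

K' = (1/2) ln cosh(2K).

The only fixed points are K* = 0 and K* = infinity, and K' < K for every finite K > 0, so repeated coarse-graining drives any finite-temperature chain toward the disordered K = 0 fixed point: the recursion alone recovers the fact that the 1D Ising model has no finite-temperature phase transition.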

Text(s): Zinn-Justin's "Phase Transitions and Renormalization Group".

Assigned problems: TBA

Replies will contain worked solutions, discussion, etc.

179
Agrulian Archives / Classical Field Theory (??? level)
« on: December 12, 2013, 09:49:07 PM »
Brief Subject Overview: classical field theory concerns the interaction of physical fields with matter. The text I'll be using focuses on fluid dynamics and elastic deformations, and on the gravitational and electromagnetic fields; it is both non-relativistic and non-quantum.
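
To fix ideas, the sort of field equations in play (standard non-relativistic forms, quoted from memory and not necessarily in Soper's notation): the Newtonian gravitational potential obeys Poisson's equation,

\nabla^2 \phi = 4 \pi G \rho,

and an ideal fluid obeys the continuity and Euler equations,

\partial_t \rho + \nabla \cdot (\rho v) = 0,    \rho ( \partial_t v + (v \cdot \nabla) v ) = - \nabla p + f.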

Text(s): Soper's "Classical Field Theory".

Assigned problems: TBA

Replies will contain worked solutions, discussion, etc.

180
Agrulian Archives / Abstract Algebra & Galois Theory (grad student level)
« on: December 12, 2013, 02:16:16 PM »
Brief Subject Overview: abstract algebra identifies the properties that make elementary algebra on the real numbers work as it does, and uplifts these properties to the status of axioms; it then explores the consequences of various combinations of these properties. Central objects of study are groups, rings, fields, etc.; straightforward questions of algebraic interest include, for example, whether and when a polynomial equation has a solution in some underlying set (the reals being the most familiar example) on which we're capable of doing algebra. Abstract algebra intertwines with many other areas of higher mathematics, as, for example, in the definition of Lie groups, which combines ideas from differential topology and abstract algebra. So far I've taken a one-semester course in this subject, and hope to take a second semester of it in the spring; however, IIRC we won't be covering much of chapter 13 (field theory) or any of chapter 14 (Galois theory), and I've always wanted to at least understand the proof that quintic polynomials don't have a closed-form solution (which I believe has something to do with Galois theory), so I'd like to teach this subarea to myself. We will be covering chapter 9 (polynomials over fields) in the spring, so the plan of study will be a bit redundant with that, but there's nothing like repetition to breed mastery.
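
For future reference, the shape of the quintic result as I understand it (to be checked carefully against chapter 14): over a field of characteristic 0, a polynomial is solvable by radicals if and only if the Galois group of its splitting field is a solvable group. A concrete witness: x^5 - 4x + 2 is irreducible over Q (Eisenstein at 2) and has exactly three real roots, which forces its Galois group to be all of S_5; since S_5 is not solvable (A_5 is simple and nonabelian), this quintic, and hence the general quintic, admits no solution in radicals.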

Text(s): Dummit & Foote's "Abstract Algebra". Focus on chapters 9, 13, 14.

Assigned problems:

Ch9~
13,14,15,16,17 p. 298-299
4,11 p. 301-303
1,2,4,5 p. 306-307
4,7,11,17 p. 311-313
1,5,6,7 p. 315
6, 8, 13, 22, 29, 33, 34, 35, 43 p. 330-335

Ch13~
1,5,7,8 p. 519
3,7,12,20 p. 529-531
1,2,4 p. 535-536
1,2,5,6 p. 545
1,6,10,11 p. 551-552
1,2,3,9,10 p. 555-557

Ch14~
1,3,5,7,9 p. 566-567
2,5,7,20,31 p. 581-585
8,11,12,15,17 p. 589-591
3,5,7,8 p. 595-596
6,10,11,12,13 p. 603-606
13, 15, 18, 33, 37, 38, 44, 51 p. 617-624
2,3,4,5 p. 635-639
2,5,6 p. 644-645
2,8,12,14,15 p. 652-654

Replies will contain worked solutions, discussion, etc.

Pages: 1 2 3 4 5 [6] 7 8 9 10 11 ... 22