Doctors tried to lower $148K cancer drug cost; makers triple price of pill


A drug that treats a variety of white blood cell cancers typically costs about $148,000 a year, and doctors can customize and quickly adjust doses simply by changing how many small-dose pills patients take each day—generally up to four. At least, that was the case until now.

Last year, doctors presented results from a small pilot trial hinting that smaller doses could work just as well as the larger dose—dropping patients down from three pills a day to just one. Taking just one pill a day could dramatically reduce costs to around $50,000 a year. And it could lessen unpleasant side-effects, such as diarrhea, muscle and bone pain, and tiredness. But just as doctors were gearing up for more trials on the lower dosages, the makers of the drug revealed plans that torpedoed the doctors’ efforts: they were tripling the price of the drug and changing pill dosages.

The drug, ibrutinib (brand name Imbruvica), typically came in 140mg capsules, and patients took anywhere from 140mg to 560mg per day depending on their cancer and individual medical situation. (There were also 70mg capsules for patients taking certain treatment combinations or having liver complications.) The pills treat a variety of cancers involving a type of white blood cell called B cells. The cancers include mantle cell lymphoma, for which the approved dose is four 140mg pills per day, and chronic lymphocytic leukemia, for which it is three 140mg pills per day. Each 140mg pill costs somewhere around $133—for now.

Imbruvica’s makers, Janssen and Pharmacyclics, have now gotten approval to sell four different tablets of varying strengths: 140mg, 280mg, 420mg, and 560mg. But the new pills will all be the same price—around $400 each—even the 140mg dose pill. The makers will stop selling the old, cheaper 140mg pill within three months, according to a report by the Washington Post.

The plan nixes any chance to lower costs with lower dosages. Even if patients can drop down to just 140mg a day, they’ll pay three times what they pay now for each 140mg pill.
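
As a rough back-of-the-envelope check on those numbers (a sketch only, using the approximate per-pill prices quoted in this story; real out-of-pocket costs vary by insurer and pharmacy), the arithmetic looks like this in Python:

    # Approximate figures from the article; actual prices vary.
    OLD_PRICE_PER_140MG_PILL = 133   # dollars, "somewhere around $133"
    NEW_FLAT_PRICE_PER_PILL = 400    # dollars, same price for every new tablet strength
    DAYS_PER_YEAR = 365

    def annual_cost(price_per_pill, pills_per_day):
        """Yearly drug cost at a given per-pill price and daily pill count."""
        return price_per_pill * pills_per_day * DAYS_PER_YEAR

    # Old scheme, chronic lymphocytic leukemia dose of three 140mg pills a day:
    print(annual_cost(OLD_PRICE_PER_140MG_PILL, 3))  # ~$145,600, in line with ~$148K/year
    # The hoped-for saving from the pilot trial: one 140mg pill a day at the old price:
    print(annual_cost(OLD_PRICE_PER_140MG_PILL, 1))  # ~$48,500, in line with ~$50K/year
    # New scheme: one tablet a day at the flat price, regardless of strength:
    print(annual_cost(NEW_FLAT_PRICE_PER_PILL, 1))   # $146,000 -- the saving disappears

Whatever strength the single daily tablet ends up being, the flat per-pill price keeps the annual cost roughly where it was with three of the old capsules.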

In a statement to the Post, Janssen and Pharmacyclics explained the move by saying the new line-up is “a new innovation to provide patients with a convenient one pill, once-a-day dosing regimen and improved packaging, with the intent to improve adherence to this important therapy.” They noted that those taking 560mg a day will save money with the new pricing.

But doctors balked at what they saw as an underhanded move. In an interview with the Post, oncologist Mark Ratain of the University of Chicago Medicine put things bluntly: “That got us kind of pissed off.”

Ratain and colleagues wrote a commentary in the weekly newsletter The Cancer Letter this month, decrying the price hike and new pill series and calling it “highly unusual.” In addition to thwarting efforts to help lower treatment costs, the doctors pointed out that the new dosage lineup will make it harder to nimbly adjust patients’ doses by simply advising them to take a different number of pills each day. Switching a patient from a 280mg or 420mg per day dose down to 140mg will require paperwork, filling a new prescription, and having patients return unused pills—a process that can drag out for weeks. And increasing a patient’s dose would either be just as lengthy a process or risk multiplying their treatment costs even further by doubling or tripling the number of pills taken each day.

In their commentary, titled in part “Sales Revenues at the Potential Expense of Patient Safety,” the doctors lay out examples of when quick dosage changes would be necessary. Those include when a patient needs to drop down while they’re on a short course of antibiotics, or to adjust for new combination cancer treatments. “Any putative convenience advantage of taking one pill a day is negated by the marked inconvenience to the patient of having to return pills every time there is a need for a dosage change,” they write.

Ratain and colleagues end with a call to the Food and Drug Administration to look into the matter, “given that it creates a barrier to optimal prescribing for some patients,” they write. “We further urge the FDA to recognize that the combination of the high price per pill and the flat pricing scheme are specific impediments to safe administration, and that ignoring the marketing approach for ibrutinib is antithetical to fostering optimally safe dosing and administration.”

4 public comments
cak0705
23 days ago
Seems like an example of increasing costs to justify increasing costs. #followthemoney
jhamill
35 days ago
This, coming after the Goldman Sachs guy asking if curing disease was a viable business model, makes it clear the Free Market and Capitalism does NOT value people's lives. I get that you're in business to make a profit. But you're in HEALTHCARE to care for people, and sometimes that means you have to take a loss to HELP people.

My gut reaction to this would be a bill that would criminally punish CEOs and companies that raise drug prices.
California
quad
33 days ago
When are the "sometimes" when firms should take losses? How does this reasoning not apply to every drug?
dnorman
36 days ago
as someone who may be looking at needing ibrutinib, fuck every single thing about this. recover your R&D costs, sure, but don't price the damned drug out of reach of patients.
Calgary
satadru
36 days ago
And thus endeth any future research into probing the lower end of the therapeutic range of drugs still under patent protection. There's just no longer any incentive to improve patient outcomes by reducing price.
New York, NY

Securing Elections


Elections serve two purposes. The first, and obvious, purpose is to accurately choose the winner. But the second is equally important: to convince the loser. To the extent that an election system is not transparently and auditably accurate, it fails in that second purpose. Our election systems are failing, and we need to fix them.

Today, we conduct our elections on computers. Our registration lists are in computer databases. We vote on computerized voting machines. And our tabulation and reporting is done on computers. We do this for a lot of good reasons, but a side effect is that elections now have all the insecurities inherent in computers. The only way to reliably protect elections from both malice and accident is to use something that is not hackable or unreliable at scale; the best way to do that is to back up as much of the system as possible with paper.

Recently, there have been two graphic demonstrations of how bad our computerized voting system is. In 2007, the states of California and Ohio conducted audits of their electronic voting machines. Expert review teams found exploitable vulnerabilities in almost every component they examined. The researchers were able to undetectably alter vote tallies, erase audit logs, and load malware on to the systems. Some of their attacks could be implemented by a single individual with no greater access than a normal poll worker; others could be done remotely.

Last year, the Defcon hackers' conference sponsored a Voting Village. Organizers collected 25 pieces of voting equipment, including voting machines and electronic poll books. By the end of the weekend, conference attendees had found ways to compromise every piece of test equipment: to load malicious software, compromise vote tallies and audit logs, or cause equipment to fail.

It's important to understand that these were not well-funded nation-state attackers. These were not even academics who had been studying the problem for weeks. These were bored hackers, with no experience with voting machines, playing around between parties one weekend.

It shouldn't be any surprise that voting equipment, including voting machines, voter registration databases, and vote tabulation systems, are that hackable. They're computers -- often ancient computers running operating systems no longer supported by the manufacturers -- and they don't have any magical security technology that the rest of the industry isn't privy to. If anything, they're less secure than the computers we generally use, because their manufacturers hide any flaws behind the proprietary nature of their equipment.

We're not just worried about altering the vote. Sometimes causing widespread failures, or even just sowing mistrust in the system, is enough. And an election whose results are not trusted or believed is a failed election.

Voting systems have another requirement that makes security even harder to achieve: the requirement for a secret ballot. Because we have to securely separate the election-roll system that determines who can vote from the system that collects and tabulates the votes, we can't use the security systems available to banking and other high-value applications.

We can securely bank online, but can't securely vote online. If we could do away with anonymity -- if everyone could check that their vote was counted correctly -- then it would be easy to secure the vote. But that would lead to other problems. Before the US had the secret ballot, voter coercion and vote-buying were widespread.

We can't, so we need to accept that our voting systems are insecure. We need an election system that is resilient to the threats. And for many parts of the system, that means paper.

Let's start with the voter rolls. We know they've already been targeted. In 2016, someone changed the party affiliation of hundreds of voters before the Republican primary. That's just one possibility. A well-executed attack that deletes, for example, one in five voters at random -- or changes their addresses -- would cause chaos on election day.

Yes, we need to shore up the security of these systems. We need better computer, network, and database security for the various state voter organizations. We also need to better secure the voter registration websites, with better design and better internet security. We need better security for the companies that build and sell all this equipment.

Multiple, unchangeable backups are essential. A record of every addition, deletion, and change needs to be stored on a separate system, on write-once media like a DVD. Copies of that DVD, or -- even better -- a paper printout of the voter rolls, should be available at every polling place on election day. We need to be ready for anything.
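
As a loose illustration of what such a change record might look like (a minimal sketch only; the filename, fields, and format below are invented for the example and are not drawn from any actual state system), every addition, deletion, or change could be appended to a log that is later burned to write-once media and printed:

    import json
    from datetime import datetime, timezone

    LOG_PATH = "voter_roll_changes.log"  # hypothetical filename for the example

    def record_change(action, voter_id, details):
        """Append one change event; the log file is only ever opened for appending."""
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "action": action,            # "add", "delete", or "update"
            "voter_id": voter_id,
            "details": details,
        }
        with open(LOG_PATH, "a", encoding="utf-8") as log:
            log.write(json.dumps(entry) + "\n")

    # Example: a routine address update is recorded alongside the change itself.
    record_change("update", "example-voter-0001", {"field": "address", "new": "..."})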

Next, the voting machines themselves. Security researchers agree that the gold standard is a voter-verified paper ballot. The easiest (and cheapest) way to achieve this is through optical-scan voting. Voters mark paper ballots by hand; they are fed into a machine and counted automatically. That paper ballot is saved, and serves as a final true record in a recount in case of problems. Touch-screen machines that print a paper ballot to drop in a ballot box can also work for voters with disabilities, as long as the ballot can be easily read and verified by the voter.

Finally, the tabulation and reporting systems. Here again we need more security in the process, but we must always use those paper ballots as checks on the computers. A manual, post-election, risk-limiting audit varies the number of ballots examined according to the margin of victory. Conducting this audit after every election, before the results are certified, gives us confidence that the election outcome is correct, even if the voting machines and tabulation computers have been tampered with. Additionally, we need better coordination and communications when incidents occur.
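
To see why the margin of victory drives how many ballots an audit has to examine, here is a toy simulation in the spirit of a ballot-polling risk-limiting audit (a sketch for intuition only: two candidates, sampling with replacement, and the true votes assumed to match the reported share; real audit procedures such as BRAVO are more involved):

    import random

    def ballots_needed(reported_winner_share, risk_limit=0.05,
                       total_ballots=1_000_000, seed=0):
        """Sequentially sample ballots, testing the reported result against a tie."""
        rng = random.Random(seed)
        threshold = 1.0 / risk_limit          # stop when the evidence ratio exceeds this
        ratio = 1.0
        for n in range(1, total_ballots + 1):
            # In this toy, each sampled ballot matches the reported vote shares.
            vote_for_winner = rng.random() < reported_winner_share
            if vote_for_winner:
                ratio *= reported_winner_share / 0.5
            else:
                ratio *= (1.0 - reported_winner_share) / 0.5
            if ratio >= threshold:
                return n                      # sample confirms the reported outcome
        return total_ballots                  # otherwise, fall back to a full hand count

    for share in (0.60, 0.55, 0.52, 0.505):
        print(share, ballots_needed(share))   # the narrower the margin, the bigger the sample

The exact numbers are beside the point; what matters is that a comfortable margin is confirmed with a small sample, while a razor-thin one forces a much deeper look at the paper.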

It's vital to agree on these procedures and policies before an election. Before the fact, when anyone can win and no one knows whose votes might be changed, it's easy to agree on strong security. But after the vote, someone is the presumptive winner -- and then everything changes. Half of the country wants the result to stand, and half wants it reversed. At that point, it's too late to agree on anything.

The politicians running in the election shouldn't have to argue their challenges in court. Getting elections right is in the interest of all citizens. Many countries have independent election commissions that are charged with conducting elections and ensuring their security. We don't do that in the US.

Instead, we have representatives from each of our two parties in the room, keeping an eye on each other. That provided acceptable security against 20th-century threats, but is totally inadequate to secure our elections in the 21st century. And the belief that the diversity of voting systems in the US provides a measure of security is a dangerous myth, because a few districts can be decisive and there are so few voting-machine vendors.

We can do better. In 2017, the Department of Homeland Security declared elections to be critical infrastructure, allowing the department to focus on securing them. On 23 March, Congress allocated $380m to states to upgrade election security.

These are good starts, but don't go nearly far enough. The constitution delegates elections to the states but allows Congress to "make or alter such Regulations". In 1845, Congress set a nationwide election day. Today, we need it to set uniform and strict election standards.

This essay originally appeared in the Guardian.


Pioneering Psychologist Hans Asperger Was a Nazi Sympathizer Who Sent Children to Be Killed, New Evidence Suggests


The term “Asperger’s syndrome” will never be heard the same way again, owing to new research showing that Hans Asperger—the Austrian pediatrician for whom the disorder was named—was an active participant in the Nazi eugenics program, recommending that patients deemed “not fit for life” be sent to a notorious…

Technicalleigh
36 days ago
Fuck him. And because it can't be said often or strongly enough, fuck Nazis.
SF Bay area, CA (formerly ATL)

There Is an Area of Plastic in the Ocean That’s Three Times the Size of France. This 23-Year-Old Thinks He Can Clean It Up.


On Wednesday, among the industrial warehouses and abandoned buildings on Alameda Island, just south of Oakland, California, a small team of engineers began the early stages of constructing “System 001.” Its bland name obscures the fact that System 001 is actually a first-of-its-kind, 2,000-foot long device intended to rid the ocean of its trillions of pieces of plastic. This week, the system took one baby step toward finally being ocean-ready. 

It’s designed sort of like an enormous and porous shower curtain: One long, U-shaped black polyethylene tube will attach to a nylon screen hanging underwater, while the entire device drifts across the ocean, using currents to collect plastic before the debris is ultimately removed.

In its final form, the floating plastic-eater will be autonomous and powered by solar energy, meaning it will be controlled by algorithms and free to roam within the ocean’s plastic hotspots. And, its curtain-like design prevents marine animals from getting trapped, as they would in a net. 

It will be the longest ocean structure ever to be deployed, and the Dutch non-profit behind it, the Ocean Cleanup, hopes a swarm of these U-shaped tubes will remove half of the 1.6-million-square-kilometer Great Pacific Garbage Patch (consisting largely of plastics) floating between California and Hawaii in just five years—and 90 percent of accumulated ocean plastic by 2040.

The effort cannot come soon enough; just last month, a study from the group, published in Nature’s Scientific Reports, found that the patch is 16 times larger than previous estimates. On Wednesday, the boyish Ocean Cleanup CEO Boyan Slat said, “It’s the first time anyone is doing anything like this. So, it’s still very much a beta system…I’m sure there will still be things that will go wrong, but that’s why we’re doing it, really, to improve it—so eventually we can deploy an entire fleet of these systems.”

Photo: “System 001” (Jackie Flynn Mogensen)

Despite its size and multi-million dollar price tag, the project has humble origins. Slat was just 16 years old when a diving trip in Greece inspired him to clean the world’s oceans. “I saw more plastic bags than fish,” he has said about the trip. He went on to eventually present his idea for a massive, plastic-collecting system in a TEDx talk in 2012, recruited a team in 2013, and in 2014, he and his team raised over $2 million in 100 days to fund the project. 

Four years and about $40 million in fundraising later, the 23-year-old Slat is finally seeing his vision come to life. “We’re ready to launch the world’s first ocean cleanup system,” he said Wednesday, standing in front of a blueprint of the contraption. “Which is being built right here.”

From Alameda, System 001 will be towed along the coast of California and then sent 240 nautical miles offshore for a “dress rehearsal” of sorts. The team will ensure the system functions properly and look for any damage to the structure caused by towing or rough ocean conditions before deploying it to the Great Pacific Garbage Patch. In the end, if all goes well, 60 of these things will be floating around the Pacific.

Photo: Ocean Cleanup CEO Boyan Slat in front of a blueprint of System 001. (Jackie Flynn Mogensen)

Cleaning the world’s oceans is no small task. The Great Pacific Garbage Patch alone is three times the size of France. And although the mass production of plastics began about six decades ago, according to the UN, the world now produces hundreds of millions of metric tons of plastic each year—with more than 8 million metric tons of it ending up in the ocean. By 2050, some scientists estimate the seas will hold more plastic than fish. 

That’s at least part of the reason why the project has seen its fair share of criticism. Roland Geyer, a professor of industrial ecology and green supply chain management at UC-Santa Barbara, tells Mother Jones an effort to clean the Great Pacific Garbage Patch is “somewhat pointless” because the vast majority of plastic in the ocean isn’t floating on the surface, and even as you work to clean it up, more plastic enters the ocean every day. “Their heart is in the right place,” he says about the Ocean Cleanup. “But I think their efforts could be better off elsewhere.” 

Despite the criticism, the group has continued raising funds and has shown no sign of altering course. If all preliminary tests go as planned, the team expects to launch their system inside the Great Pacific Garbage Patch, 1,200 nautical miles offshore, this summer, with hopes to bring the first haul of plastic in by the end of the year. The plastics can then be turned into and sold as products, like the Ocean Cleanup sunglasses pictured below. 

“Plastic doesn’t have to be ocean plastic pollution,” says Slat. “I think it’s really time to go clean it up.”

Photo: The group plans to turn the pollution they collect into products, like these branded ocean-plastic sunglasses. (Jackie Flynn Mogensen)

Technicalleigh
35 days ago
Awesome! Forget that "everything is bad so why even try" naysayer, I'm rooting for these folks.
SF Bay area, CA (formerly ATL)

Election Security


I joined a letter supporting the Secure Elections Act (S. 2261):

The Secure Elections Act strikes a careful balance between state and federal action to secure American voting systems. The measure authorizes appropriation of grants to the states to take important and time-sensitive actions, including:

  • Replacing insecure paperless voting systems with new equipment that will process a paper ballot;

  • Implementing post-election audits of paper ballots or records to verify electronic tallies;

  • Conducting "cyber hygiene" scans and "risk and vulnerability" assessments and supporting state efforts to remediate identified vulnerabilities.

The legislation would also create needed transparency and accountability in elections systems by establishing clear protocols for state and federal officials to communicate regarding security breaches and emerging threats.


Novelty Requires Explanation


Epistemic status: Reasonably confident, but I should probably try to back this up with numbers about how often elementary results actually do get missed.

Attention conservation notice: More than a little rambling.

Fairly regularly you see news articles about how some long-standing problem that has stumped experts for years has been solved, usually with some nice simple solution.

This might be a proof of some mathematical result, a translation of the Voynich manuscript, a theory of everything. Those are the main ones I see, but I’m sure there are many others that I don’t see.

These are almost always wrong, and I don’t even bother reading them any more.

The reason is this: If something is both novel and interesting, it requires an explanation: Why has nobody thought of this before?

Typically, these crackpot solutions (where they’re not entirely nonsensical) are so elementary that someone would surely have discovered them before now.

Even for non-crackpot ideas, I think this question is worth asking when you discover something new. As well as being a useful validity check for finding errors and problems, if there is a good answer then it can often be enlightening about the problem space.

Potentially, it could also be used as a heuristic in the other direction: If you want to discover something new, look in places where you would have a good answer to this question.

There are a couple of ways this can play out, but most of them boil down to numbers: If a lot of people have been working on a problem for a long time during which they could have discovered your solution, they probably would have. As nice as it would be to believe that we were uniquely clever compared to everyone else, that is rarely the case.

So an explanation basically needs to show some combination of:

  1. Why not many people were working on the problem
  2. Why the time period during which they could have discovered your technique is small

The first is often a bad sign! If not many people work on the problem, it might not be very interesting.

This could also be a case of bad incentives. For example, I’ve discovered a bunch of new things about test case reduction, and I’m pretty sure most of that is because not many people work on test case reduction: It’s a useful tool (and I think the problem is interesting!), but it’s a very niche problem at a weird intersection of practical needs and academic research where neither side has much of a good incentive to work on it.

As a result, I wouldn’t be surprised if an appreciable percentage of person-hours ever spent on test-case reduction were done by me! Probably not 10%, but maybe somewhere in the region of 1-5%. This makes it not very surprising for me to have discovered new things about it even though the end result is useful.

More often I find that I’m just interested in weird things that nobody else cares about, which can be quite frustrating and can make it difficult to get other people excited about your novel thing. If that’s the case, you’re probably going to have a harder time marketing your novel idea than discovering it.

The more interesting category of problem is the second: Why have the people who are already working on this area not previously thought of this?

The easiest way out of this is simply incremental progress: If you’re building on some recent discovery then there just hasn’t been that much time for anyone else to find it, so you’ve got a reasonable chance of being the first to discover it!

Another way is by using knowledge that they were unlikely to have – for example, by applying techniques from another discipline with little overlap in practice with the one the problem is from. Academia is often surprisingly siloed (but if the problem is big enough and the cross-disciplinary material is elementary enough, this probably isn’t sufficient. It’s not that siloed).

An example of this seems to be Thomas Royen’s recentish proof of the Gaussian Correlation Inequality (disclaimer: I don’t actually understand this work). He applied some fairly hairy technical results that few people working on the problem were likely to be familiar with, and as a result was able to solve something people had been working on for more than 50 years.

A third category of solution is to argue that everyone else had a good chance of giving up before finding your solution: e.g. if the solution is very complicated or involved, it has a much higher chance of being novel (and also a much higher chance of being wrong, of course)! Another way this can happen is if the approach looks discouraging in some way.

Sometimes all of these combine. For example, I think the core design of Hypothesis is a very simple, elegant, idea, that just doesn’t seem to have been implemented before (I’ve had a few people dismissively tell me they’ve encountered the concept before, but they never could point me to a working implementation).

I think there are a couple reasons for this:

  1. Property-based testing just doesn’t have that many people working on it. The number might top 100, but I’d be surprised if it topped 200 (Other random testing approaches could benefit from this approach, but not nearly as much. Property-based testing implements lots of tiny generators and thus feels many of the problems more acutely).
  2. Depending on how you count, there’s maybe been 20 years during which this design could have been invented.
  3. Simple attempts at this approach work very badly indeed (In a forthcoming paper I have a hilarious experiment in which I show that something only slightly simpler than what we do completely and totally fails to work on the simplest possible benchmark).

So there aren’t that many people working on this, they haven’t had that much time to work on it, and if they’d tried it it probably would have looked extremely discouraging.

In contrast I have spent a surprising amount of time on it (largely because I wanted to and didn’t care about money or academic publishing incentives), and I came at it the long way around so I was starting from a system I knew worked, so it’s not that surprising that I was able to find it when nobody else had (and does not require any “I’m so clever” explanations).

In general there is of course no reason that there has to be a good explanation of why something hasn’t been discovered before. There’s no hard cut off line where something goes from “logically must have been discovered” to “it’s completely plausible that you’re the first” (discontinuous functions don’t exist!), it’s just a matter of probabilities. Maybe it’s very likely that somebody hasn’t discovered it before, but maybe you just got lucky. There are enough novel things out there that somebody is going to get lucky on a fairly regular basis, it’s probably just best not to count on it being you.

PS. I think it very unlikely this point is novel, and I probably even explicitly got it from somewhere else and forgot where. Not everything has to be novel to be worthwhile.

1 public comment
mareino
96 days ago
Great P.S.!
Washington, District of Columbia