The 'show why science is awesome' thread:

Some are, some are not. Depends on where they stand in their evolution.

Bacterial genomes average about 88% protein-coding content. Compare that to 2% for human DNA.


But the smaller size of bacterial genomes, their fiercely competitive environment and short generation times, the prevalence of horizontal gene transfer, and the plasticity of their genomes (single-chromosome, circular, free-floating in the cytoplasm) mean that they are among the most evolved lifeforms on the planet. I suspect the efficiency of their genomes is near maximum.
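If you want to sanity-check that 88% figure yourself, here’s a rough sketch using Biopython (assuming you’ve downloaded some annotated bacterial genome, e.g. E. coli K-12 from NCBI - the filename “genome.gb” is just a placeholder):

```python
# Rough sketch: estimate the protein-coding fraction of a bacterial
# genome from its GenBank annotation. "genome.gb" is whatever
# annotated genome you downloaded; this is illustrative, not a tool.
from Bio import SeqIO

record = SeqIO.read("genome.gb", "genbank")

# Mark every base covered by at least one CDS feature
# (genes can overlap, so count positions, not feature lengths).
coding = [False] * len(record.seq)
for feature in record.features:
    if feature.type == "CDS":
        for pos in range(int(feature.location.start), int(feature.location.end)):
            coding[pos] = True

fraction = sum(coding) / len(coding)
print(f"Protein-coding fraction: {fraction:.1%}")
```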

Not science being awesome, or maybe awesome science uncovering rotten science?

From the last link (first chronologically):

First, what bothers me isn’t just that people said 5-HTTLPR mattered and it didn’t. It’s that we built whole imaginary edifices, whole castles in the air on top of this idea of 5-HTTLPR mattering. We “figured out” how 5-HTTLPR exerted its effects, what parts of the brain it was active in, what sorts of things it interacted with, how its effects were enhanced or suppressed by the effects of other imaginary depression genes. This isn’t just an explorer coming back from the Orient and claiming there are unicorns there. It’s the explorer describing the life cycle of unicorns, what unicorns eat, all the different subspecies of unicorn, which cuts of unicorn meat are tastiest, and a blow-by-blow account of a wrestling match between unicorns and Bigfoot.

This is why I start worrying when people talk about how maybe the replication crisis is overblown because sometimes experiments will go differently in different contexts. The problem isn’t just that sometimes an effect exists in a cold room but not in a hot room. The problem is more like “you can get an entire field with hundreds of studies analyzing the behavior of something that doesn’t exist”. There is no amount of context-sensitivity that can help this.

Speaking as a sometime geneticist who’s been in practice since the late ’90s, I can say that there’s been a tremendous change in how the field works.

First, we gather many orders of magnitude more data than we used to. In the late ’90s, I was taking guesses at a few mutations on a few genes in a few hundred individuals; we now survey tens of thousands of individuals across their entire genomes (millions of mutations). Further, the manpower needed has shrunk: it no longer takes an army to generate this data.

Second, we’ve moved from a few experimentalists who knew some very basic statistics to big-data-style analysis, where teams of software engineers, data scientists, mathematicians, and statisticians work together on this huge blob of data. We’ve become far better at understanding the ways a study can be confounded, and even the junior people on a project have a stronger statistical background. Projects are now interdisciplinary efforts that sometimes involve hundreds of researchers.
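For the curious, here’s a toy version of the kind of confounder-aware test those teams run all day: regress a phenotype on a SNP’s genotype while including principal components of the genotype matrix as covariates, the standard correction for population stratification. Everything below is simulated data; assumes numpy and statsmodels.

```python
# Toy sketch of a confounder-aware association test: regress a
# phenotype on one SNP's genotype while controlling for the top
# principal components of the genotype matrix. All data is simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_people, n_snps = 5000, 200

genotypes = rng.integers(0, 3, size=(n_people, n_snps)).astype(float)  # 0/1/2 allele counts
phenotype = rng.normal(size=n_people)  # null phenotype: no real effect exists

# Top 10 principal components of the (centered) genotype matrix.
centered = genotypes - genotypes.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
pcs = centered @ vt[:10].T

# Test one SNP: phenotype ~ intercept + genotype + PCs.
X = sm.add_constant(np.column_stack([genotypes[:, 0], pcs]))
fit = sm.OLS(phenotype, X).fit()
print(f"SNP effect p-value (PC-adjusted): {fit.pvalues[1]:.3f}")
```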

Third, I’d like to think that the statisticians “won” the war of keeping most people intellectually honest. Twenty years ago, p-value hacking wasn’t cheating; it was a perfectly OK way to “get some results.” Now it’s well known that these sorts of statistical cheats are invalid and that the results won’t stand the test of time. Additionally, because so many groups are generating huge swaths of data, it’s far easier to replicate results, so even when the winner’s curse strikes, it’s corrected more quickly.
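To make that concrete, here’s a five-minute simulation (numpy/scipy assumed) of why uncorrected multiple testing was such a trap: test 100 pure-noise hypotheses and count the “discoveries” with and without a Bonferroni correction.

```python
# Simulate why uncorrected multiple testing "finds" effects in pure
# noise: run 100 t-tests on null data and count hits at p < 0.05,
# with and without a Bonferroni correction.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_tests, alpha = 100, 0.05

pvals = []
for _ in range(n_tests):
    a = rng.normal(size=30)  # both groups drawn from the same
    b = rng.normal(size=30)  # distribution: no true effect exists
    pvals.append(stats.ttest_ind(a, b).pvalue)
pvals = np.array(pvals)

print(f"Uncorrected 'discoveries': {(pvals < alpha).sum()}")           # ~5 expected
print(f"Bonferroni 'discoveries':  {(pvals < alpha / n_tests).sum()}")  # ~0 expected
```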

https://freakonometrics.hypotheses.org/19817

So I am fairly convinced that modern genetics papers don’t make these sorts of errors. Other fields should gradually become a lot more rigorous about their statistics too (if they aren’t already: I’m sure particle physics usually gets it right!).

Still, it’s a bit disconcerting that whole fields have been founded, and have flourished, on chance results, publication bias, and poor statistical understanding. There is probably a lot of published work that modern statistical techniques would burn to the ground.

Which makes me wonder what other poorly understood tools are the foundation for a lot of other results! The overall process will eventually correct these mistakes, but the mistakes can last a long time!

Edit: Another nice line:

the whole point of studying is that, once you have done 450 studies on something, you should end up with more knowledge than you started with. In this case we ended up with less.

Publication bias is a bitch.

Imagine some other team is finding amazing results that the community accepts as real. You don’t find the same result. Are you right, or do you just decide you’re unlucky/shit at science and never publish your paper? What’s more, publishing negative results used to be really hard to do. Papers devoted to replication became much less looked down on only in the mid-2000s.
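You can watch publication bias manufacture an effect in a few lines (simulated data, numpy/scipy assumed): the true effect below is exactly zero, but if only the significant studies get published, the “literature” still reports a healthy effect size.

```python
# Simulate publication bias: the true effect is exactly zero, but if
# only "significant" studies get published, the published literature
# reports a sizable effect anyway. The other ~95% of studies sit
# unpublished in the file drawer.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
published = []
for _ in range(1000):  # 1000 small studies of a nonexistent effect
    treated = rng.normal(0.0, 1.0, size=20)
    control = rng.normal(0.0, 1.0, size=20)
    if stats.ttest_ind(treated, control).pvalue < 0.05:
        published.append(abs(treated.mean() - control.mean()))

print(f"Studies published: {len(published)} / 1000")
print(f"Mean |effect| in the literature: {np.mean(published):.2f}")  # far from 0
```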

OMG this, a billion times.

A lot of organisms already extend the 20 amino acids to a degree. Humans have several different proteins that include selenocysteine via a specific tweak to the genetic code. Other ‘odd’ amino acids are made by modifying a protein after translation.

Anyway, the ribosome could probably deal with an extended amino acid set - it just takes whatever the tRNA brings. What you would need to engineer are the aminoacyl-tRNA synthetases. You could probably encode a few novel amino acids that way, as long as you stick within the constraints of codon-anticodon pairing.
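As a cartoon of the codon-reassignment idea (a toy model, nothing like real synthetic biology): reassigning the amber stop codon UAG is the classic way to sneak a new amino acid into the code. The “X” amino acid and the truncated codon table below are made up for illustration.

```python
# Toy model of codon reassignment: translate an mRNA with a standard
# codon table, then again with the amber stop codon (UAG) reassigned
# to a hypothetical non-standard amino acid "X" -- the same trick
# (amber suppression) used to encode unnatural amino acids in vivo.

STANDARD = {
    "AUG": "M", "UUU": "F", "GGC": "G", "AAA": "K",
    "UAG": "*",  # amber stop codon
    # ... rest of the 64-codon table elided for brevity
}

def translate(mrna: str, table: dict) -> str:
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        aa = table[mrna[i:i + 3]]
        if aa == "*":      # stop codon: terminate translation
            break
        peptide.append(aa)
    return "".join(peptide)

mrna = "AUGUUUUAGAAA"
print(translate(mrna, STANDARD))             # -> "MF" (stops at UAG)

amber_suppressed = {**STANDARD, "UAG": "X"}  # reassign UAG to "X"
print(translate(mrna, amber_suppressed))     # -> "MFXK" (reads through)
```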

I’m not sure how much you could gain by doing all that - you’d need some way to figure out what the heck your new types of proteins can do in a cell. It would be a lot more profitable to just keep using the sort of mutagenesis / evolutionary selection / large-scale screening that is already being done.

https://www.nature.com/articles/d41586-019-01625-5

Not sure why that is staying as just a text link.

Video filmed at four trillion frames per second captures light in a flash

Super-high-speed camera produces a film consisting of 60 consecutive frames.

At first the article made me wonder if it’s possible to develop a camera fast enough to record the passage of light through a vacuum. But, duh, you need a medium of some sort for the light to be visible.
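Back-of-the-envelope on just how fast that is: at four trillion frames per second, light only covers about 75 micrometers between frames, so the whole 60-frame film spans roughly 4.5 mm of flight.

```python
# How far does light move between frames at 4 trillion fps?
c = 299_792_458             # speed of light in vacuum, m/s
fps = 4e12                  # four trillion frames per second
per_frame = c / fps         # ~7.5e-5 m, i.e. ~75 micrometers per frame
film_span = 60 * per_frame  # the 60-frame film covers ~4.5 mm of travel
print(f"{per_frame * 1e6:.0f} um per frame, {film_span * 1e3:.1f} mm total")
```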

I’m just super excited to see our precision improve so greatly. I really would love to know if there is a minimum amount of time and/or space such that you can’t get any faster or zoom in any further.

There are a number of theories, string theory chief among them, but we won’t know until our measurement ability gets way better than it is now.
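For reference, the usual candidates for such a floor are the Planck length and time, which fall straight out of the fundamental constants (whether they’re true minima is exactly the open question):

```python
# Compute the Planck length and Planck time from fundamental
# constants -- the conventional (but unproven) candidates for a
# smallest meaningful distance and duration.
hbar = 1.054_571_817e-34   # reduced Planck constant, J*s
G = 6.674_30e-11           # gravitational constant, m^3 kg^-1 s^-2
c = 299_792_458            # speed of light, m/s

l_planck = (hbar * G / c**3) ** 0.5   # ~1.6e-35 m
t_planck = (hbar * G / c**5) ** 0.5   # ~5.4e-44 s
print(f"Planck length: {l_planck:.2e} m")
print(f"Planck time:   {t_planck:.2e} s")
```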

This is sooo cool! I want a house (and roof) made out of this!

This is wonderful.

https://medicalxpress.com/news/2019-05-radio-wave-therapy-effective-liver-cancer.html

Yet also potentially a little scary. I’d always pooh-poohed any RF (cell phone)–cancer connection because there seemed to be no mechanism for cells to respond to RF. But if calcium channels can “act like an antenna” to kill only cancer cells, what other teeny antennas are in us?

Meanwhile, how the heck did they discover the “cancer-specific, amplitude-modulated radiofrequency electromagnetic fields”, and what music does it play? https://youtu.be/UIVe-rZBcm4
Or Mozart?

Goddamn I’d play that to my liver any day :) excellent find @gruntled!
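In case “amplitude-modulated RF” sounds exotic: it’s just a fixed high-frequency carrier whose strength is swept by a much slower signal, s(t) = (1 + m·cos(2πf_m·t))·cos(2πf_c·t). The frequencies in this sketch are made up for illustration; the paper’s actual tumor-specific modulation program isn’t reproduced here.

```python
# What an amplitude-modulated RF signal looks like mathematically:
# a fixed high-frequency carrier whose amplitude is swept by a much
# lower "modulation" frequency. The numbers below are illustrative
# only, not the ones used in the study.
import numpy as np

f_carrier = 27e6        # hypothetical RF carrier, Hz
f_mod = 1000.0          # hypothetical modulation frequency, Hz
depth = 0.8             # modulation depth (0..1)

t = np.linspace(0, 0.005, 1_000_000)  # 5 ms of signal, well above Nyquist
signal = (1 + depth * np.cos(2 * np.pi * f_mod * t)) * np.cos(2 * np.pi * f_carrier * t)
print(f"peak amplitude: {signal.max():.2f}")  # ~1.8 = 1 + depth
```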

Kind of creepy.

I wonder how they fix this? Draw some energy and run that through toaster wires to heat the panels? :/

Eh, only 2%?