Wednesday, June 30, 2010
What with the summer in full swing, and the Tour de France about to start, TAE is feeling sporty. The last several weeks I've been back in the swing of riding my bike, and doing some evening yoga with my lovely bride.
But it seems like most of the sports I like, or even kinda like, are saturated with people. Last night at the spot where I was riding I passed or was passed by no less than a dozen riders.
So I decided last night that what I need is a new sport. It needs to meet three basic criteria:
1. Not many people are doing it.
2. It must require intense physical exertion in short bursts. I'm not looking to get into ultra marathons.
3. It must not be expensive.
The Official Sport of TAE: Fartlekking.
Obviously, the funny name is the appeal here, but rather than a reference to flatulence, I am referencing the Swedish term for "Speed Play". Fartlek style training was developed in 1937 by Gosta Holmer as a way to boost endurance via aerobic and anaerobic interval training. A full explanation of fartlek-style interval training can be found here and here.
But while fartlek training sounds fun, the point there is training, not competition. TAE thinks that this could be easily adapted into a sport; fartlekking, in which participants race each other up and down hills in various terrains.
The Official Rules of Fartlekking:
Rule 1: Highest Total Points Wins.
Rule 2: An "up down" is defined as a single course that traverses from one point to another. A typical "up down" involves an altitude increase (and subsequent decrease) of no less than 100 feet over a distance of at least 1/4 linear mile. A single "up down" can be no longer than 1/2 linear mile in length, and can have a maximum altitude increase of 400 feet.
Rule 3: The first competitor to reach the endpoint of an individual "up down" gets three points. The second competitor to reach the endpoint gets two points. The third competitor to reach the endpoint gets one point. All other competitors to reach the endpoint get zero points. Any competitor that does not successfully reach the endpoint of an individual "up down" within the maximum time allotment gets a one point deduction.
Rule 4: No Blood Rule: Competitors shall be physically evaluated for blood at the end of each "up down". Any blood found shall cause that competitor to be deducted 1 point. Carryover blood from previous "up downs" that has already been penalized shall not be deducted again.
Rule 5: Weight bonus: Any competitor may carry extra weight with them during an "up down". The added mass shall be pre-weighed (or in known increments, e.g. dumbbells) and the competitor's weight shall also be recorded. A competitor may obtain a "weight bonus" for successfully completing an "up down" carrying extra weight. The bonus structure shall be as follows:
Carrying 5-9.9% of personal weight: 1 pt
Carrying 10-14.9% of personal weight: 2 pts
Carrying 15-24.9% of personal weight: 3 pts
Carrying 25% or more of personal weight: 4 pts
In this way, a competitor could theoretically earn more points for carrying a large amount of weight (and finishing within the time limit) than the competitor who carried no weight but finished in the least amount of time.
Rule 6: Water Rule: Water may only be considered "extra weight" for the weight bonus (see Rule 5) if the water is intact at the end of the "up down." For example, if a competitor started the "up down" carrying a gallon of water, but drank half of it, they'd only be awarded points for the half gallon of water they had not consumed. Similarly, if they drop and spill a container of water, they will not be awarded points for it at the end of the "up down."
Rule 7: Fartlekking paths may not follow pre-established hiking/mountain biking/walking trails for more than 50 consecutive feet. They may, however, follow game trails.
Rule 8: Fart Licking Penalty: Anyone who makes a joke about "fart-licking" is immediately deducted 3 points.
Rule 9: In the event of a tie at the end of series of "up down" runs, the two or more competitors that share a tie will compete in a single, sudden death "up down", fastest to the endpoint is the winner. No weight bonus shall be allowed during this "up down". No blood penalty shall be incurred.
Rule 10: Puke Bonus: Any competitor who vomits during an "up down" gets an automatic 2 point bonus if they successfully reach the endpoint by the time limit. However, that competitor is not allowed to compete in the next "up down."
The winner of the fartlekking session who has the most points at the end of the "up downs" is called The Fartlekker.
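For the scorekeepers out there, the rules above boil down to a few lines of code. This is a quick sketch: the function names, the argument order, and the choice to score each "up down" independently are my own assumptions; only the point values come straight from the rules.

```python
# Toy scorer for a single fartlekking "up down", per the Official Rules.

def placement_points(place):
    """Rule 3: 3/2/1 points for the first three finishers, 0 after that."""
    return {1: 3, 2: 2, 3: 1}.get(place, 0)

def weight_bonus(carried_lbs, body_lbs):
    """Rule 5: bonus tiers based on carried weight as a % of body weight."""
    pct = 100.0 * carried_lbs / body_lbs
    if pct >= 25.0:
        return 4
    if pct >= 15.0:
        return 3
    if pct >= 10.0:
        return 2
    if pct >= 5.0:
        return 1
    return 0

def up_down_score(place, finished, carried_lbs, body_lbs, bled, puked):
    score = 0
    if finished:
        score += placement_points(place)
        score += weight_bonus(carried_lbs, body_lbs)
        if puked:            # Rule 10: puke bonus (finishers only)
            score += 2
    else:
        score -= 1           # Rule 3: DNF penalty
    if bled:                 # Rule 4: no-blood rule
        score -= 1
    return score

# A 180 lb competitor finishing second while hauling 30 lbs (16.7%):
print(up_down_score(2, True, 30, 180, False, False))  # 5
```

Note how the weight bonus works exactly as promised: a fourth-place finisher carrying 25% of body weight (4 points) beats an unweighted first-place finisher (3 points).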
I think this could really catch on. It seems like it'd be a really good workout involving interval training, eccentric muscle contraction, and both aerobic and anaerobic exercise. The use of weight bonuses allows one to build strength beyond what they could carrying only themselves. The rugged terrain required makes this very outdoorsy, and it can be played in almost any weather condition, any time of year. In fact, rough weather makes it more intense.
Interval training like this is one of the best ways to lower LDL levels, raise HDL levels, and increase both your aerobic and anaerobic thresholds, and it's a great way to hang out with your friends.
See you on the up and up!
Tuesday, June 29, 2010
The idea of the ocean as a place of limitless bounty persists even now. That is folly, as evidenced by a long list of over-fished species -- cod, herring, and now tuna are but three examples -- whose populations have collapsed in locales all over the world. It ought to be the defining environmental issue of our time.
It is not.
Read the fantastic post here. And pray for your children.
Monday, June 28, 2010
And yet, here's news that the ESA (which stands for "Not Russian Space Agency") has successfully launched its 37th consecutive rocket into space.
TAE goes on and on about the retirement of the shuttles being a good thing. I also go on and on about how ludicrous and biased old retired astronauts sound when they pine for a return to the days of yore when NASA was sending them into space (and they just happened to be cultural icons).
And yet, with the Falcon 9 rocket successfully launched, and with ESA rockets going up...it seems America and its allies do have a lot of potential alternatives to "buying rides from the Commies". Food for thought, Neil.
THIS IS MY YEAR!!
Of course, it would be the year that we really can't afford to have atmospheric disturbances in an already oily Gulf...
Friday, June 25, 2010
Earlier this week a fun interview (and by 'fun' I mean 'depressing') of Frank Fenner was conducted by The Australian. In it, Fenner outlines how humans are doomed, and exactly how soon he thinks our species (along with most others) will disappear completely.
"We're going to become extinct," the eminent scientist says. "Whatever we do now is too late." And then this gem:
"Homo sapiens will become extinct, perhaps within 100 years," he says. "A lot of other animals will, too. It's an irreversible situation. I think it's too late. I try not to express that because people are trying to do something, but they keep putting it off."
If you ask me lately, I'd have to agree with Fenner that it's too late. But I don't really see humanity going extinct unless all life on this planet does; we are too adaptable and resilient to just die off.
I can imagine (and consider the imaginings quite feasible) a point where global chaos erupts and the human population rapidly diminishes to less than 5% of current. But even if some sort of supervirus emerged, a small percentage of the population would be immune. Will Smith's character, Dr. Robert Neville, in "I Am Legend" puts it pretty accurately: a deadly virus would kill about 90% of the population immediately, or 6.3 billion. Of the remaining, only about 2 percent would be completely immune; the rest would probably be weakened enough by the virus/bacteria that other things would kill them.
So assuming the worst case: susceptibility to the virus/bacteria means death from the virus or from side-effects (think dehydration-death via diarrhea-inducing parasites), you're still left with a human population of 14,000,000. That's a lot of people. Sure, they'd be scattered in pockets (the genetic immunity would be hereditary and thus isolated groups of related people would all survive), so you would quickly have a scenario where the entire world collapsed into chaos, but small villages would survive. And suddenly they'd have a bounty of resources, making their destruction more difficult.
However, a 99.8% deadly virus is pretty damn unlikely. One of the deadliest viruses of all time, the one responsible for the Spanish Flu pandemic, had a mortality rate of 10-20%.
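The back-of-envelope math here is worth checking. Starting from 7 billion people and a 90% kill rate (the post's assumptions, not epidemiology), arriving at 14 million survivors means about 2 percent of the initial survivors must be fully immune:

```python
# Sanity-checking the "I Am Legend" worst-case arithmetic.

world_pop = 7_000_000_000
dead = world_pop * 90 // 100      # the virus kills 90% outright
survivors = world_pop - dead
print(dead)                       # 6300000000 -- the 6.3 billion above

# For 14,000,000 fully immune people to remain, the immune share of
# the 700 million initial survivors works out to about 2 percent:
immune_fraction = 14_000_000 / survivors
print(f"{immune_fraction:.1%}")   # 2.0%
```

Which also means the worst case described is a virus that is 99.8% deadly overall (14 million out of 7 billion left standing).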
In any case, I've mentioned before that we could be far, far past the point of no return in terms of environmental instability. And that could make things real hard for our species. But I just don't see us dying off completely, as long as there is life on this planet.
Monday, June 21, 2010
My guess is quite a bit. Much of the oil will wash up on shores and stay there, or sink to the bottom and be slowly reclaimed by bacteria.
But what if...just what if...this is the beginning of the end for our global ecosystem? What if aliens will visit 2,000 years in the future and find a lifeless, oily tarball of a planet, and using their vast supercomputers they back-calculate and determine that the moment life on Earth was doomed was April 20, 2010 by the human calendar?
The thought that failure to produce a successful relief well might mean the Deepwater Horizon well spewing oil into the oceans for the next ten years is chilling.
The question, Moses said, is "Can we build a miniature Sun on Earth?" The recipe involves a peppercorn-size target of hydrogen isotopes deuterium and tritium heated to 200 million degrees Fahrenheit for a couple billionths of a second. To get that micro-blast of heat, the National Ignition Facility (NIF) uses lasers---coherent light---at a massive scale. Laser engineer Moses notes that photons are perfect for the job: "no mass, no charge, just energy."
Moses ran a dramatic video showing how a shot at the NIF works. 20-foot-long slugs of amplified coherent light (10 nanoseconds) travel 1,500 yards and converge simultaneously through 192 beams on the tiny target, compressing and heating it to fusion ignition, with a yield of energy 10 to 100 times what goes into it. Successful early test shots suggest that the NIF will achieve the first ignition within the next few months, and that shot will be heard round the world.
TAE asks the question no one else does: where does Ed expect to get all that tritium? Ed claims that "1 liter of heavy water will yield the energy of 2 million gallons of gas."
But TAE knows from his own research into betavoltaics that suggesting that there is 1/2 liter of tritium (I'm assuming his tritium-deuterium mix is 50/50 for simplicity) on Earth borders on the absurd. Also, 2 million gallons of gas really isn't all that much. America uses roughly 380 million gallons of gas a day, depending on who you ask.
Ed makes this hilariously absurd claim: "there is no such thing as peak hydrogen." So suddenly we're talking about hydrogen fusion...not deuterium and tritium? This is such a misleading comment. The problem is that no one has had any luck fusing pure hydrogen - which is why the LLNL team is using deuterium and tritium in their experiments. In fact, it's pretty safe for me to say "the only being capable of sustained hydrogen fusion is God."
Nevertheless, Ed seems to think that in ten years we'll have nuclear fusion power plants chugging along at a fair clip, ushering in a new, clean era.
And yet, I have to keep asking: "Ed, where are you going to get all that tritium? Only 500 pounds of it have been made since 1955!" There is such a thing as peak tritium.
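To put numbers on the objection: take Ed's own figure at face value (1 liter of heavy water yields the energy of 2 million gallons of gas) and a U.S. consumption figure of roughly 380 million gallons of gasoline per day (my rough estimate, not the talk's). The fuel volume itself is trivial; the catch, as argued above, is the tritium half of the D-T mix.

```python
# Working Ed's claim against U.S. gasoline demand.

GAS_PER_LITER = 2_000_000     # gallons of gas-equivalent per liter (Ed's claim)
US_DAILY_GAS = 380_000_000    # gallons of gasoline per day, my rough figure

liters_per_day = US_DAILY_GAS / GAS_PER_LITER
print(liters_per_day)         # 190.0 liters of heavy water per day

# Deuterium at that scale is easy to supply. But a D-T plant needs
# tritium too, and only ~500 lbs of tritium have been made since 1955.
```

So the deuterium side of the recipe is a rounding error; it's the tritium side that makes "no such thing as peak hydrogen" ring hollow.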
TAE reminds his audience that he is a huge proponent of fusion power. But the LLNL technique isn't the solution, rather it is a great method for data acquisition. A sustained nuclear fusion plant will need to either break the barrier and fuse hydrogen (protium), or we need to start mining helium-3 from the Moon, where it is fairly plentiful.
Thursday, June 17, 2010
But with the millions of large, high-speed rocks floating around our sun, we should expect the number of them that impact Earth to be relatively high. Not so. It turns out that Jupiter, with its massive gravity and an orbit that sits outside of ours, has acted like a bullet shield for us, sucking up massive numbers of meteoroids and asteroids. It is probable, in fact, that life on Earth is simply a puny little quarterback, and the success of life here is simply because we had a massive offensive guard run-blocking in front of us.
Tuesday, June 15, 2010
When you become uber-popular again, and everyone wants you to be a part of their scene, and they talk about how you've remade yourself like Robert Downey Jr., and they all say you are so hot again...remember that here at TAE we were undyingly loyal to you, and practically worshipped you, even through the worst of times.
Monday, June 14, 2010
Here, it is reported that scientists have successfully embedded a nanowire into the lipid bilayer of a cell. You probably didn't hear about it on the local news, but this is big. Remember TAE's Law of Bionics: All you need is drivers.
TAE believes the great revolution in human-machine interfaces will come down not to some sort of innovative and breakthrough technology, but rather to the smart people who come up with a way to code 1's and 0's into the language of the brain. Think of it like this: your brain is the computer box sitting next to you. When you plug your mouse into the computer, the computer needs a driver file to translate the data from the mouse into data that the computer can use. If drivers are missing, or the wrong drivers are installed, the mouse won't work, or will work incorrectly. Get a different mouse, and you'd better get new drivers.
The same holds true, I believe, for connecting machines to the brain. Machines work in discrete packets of digital information. Your central nervous system, however, does not. Nerves send pulses in different frequencies, different amplitudes, and different combinations in order to convey complex information to the brain.
The group of individuals who can develop a method to send machine-derived information to the brain via nerves will be very rich indeed. Imagine a USB port on your arm. Imagine being able to plug into your computer directly, and hear music directly in your brain. "Yeah right" you laugh.
But it's basically completely plausible. Right now, you receive audio information from your ears. The waves of sound cause little hairs to vibrate, which excites nerves. These nerves then carry the information to the brain, which "hears" the sounds. Why not send the same information from a different set of nerves? Why not send it through a USB port on your arm? Your brain, receiving the sound data, would assume it is hearing music, regardless of the source. You could potentially listen to music at any volume through your USB implant, as the actual environment would be completely silent, and your ears would not be harmed.
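To make the driver metaphor concrete, here is a toy sketch of what such a translation layer might look like, assuming simple rate coding (louder sample, faster pulses). Everything here, the rate limits, the linear mapping, the function names, is an illustrative assumption of mine; real auditory-nerve coding is far richer than this.

```python
# A toy "neural driver": digital audio samples in, pulse rates out.

MAX_RATE_HZ = 300      # assumed peak firing rate for the interface
MIN_RATE_HZ = 5        # assumed resting rate

def sample_to_pulse_rate(sample):
    """Map an unsigned 8-bit audio sample (0-255) to a firing rate.

    This is rate coding: bigger amplitude, faster pulses -- roughly
    the trick the inner ear's hair cells play on the auditory nerve.
    """
    level = sample / 255.0
    return MIN_RATE_HZ + level * (MAX_RATE_HZ - MIN_RATE_HZ)

def encode(samples):
    """The 'driver': a stream of 1's and 0's becomes pulse frequencies."""
    return [sample_to_pulse_rate(s) for s in samples]

rates = encode([0, 128, 255])
print([round(r, 1) for r in rates])   # [5.0, 153.1, 300.0]
```

The hard part, of course, is not this mapping but discovering what the brain's side of the protocol actually is; that's the driver problem in a nutshell.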
Or imagine if you could see behind you. Right now your eyes are on the front of your head, receiving photons, and translating them into nerve pulses that travel through your optic nerves into your brain. Why not send that data from an external device through a USB implant and get visual data that way? Really, you could "watch" anything, anytime. Augmented Reality would take on a whole new meaning if instead of overlaying an image in front of your eyes and then receiving that information through your eyes...instead you sent the AR information directly to the brain.
I know this sounds fanciful and ridiculous, but it brings us full circle to the headline at the beginning of the post. If scientists can develop a method to create nerve-wire connections (that translate data) without causing cell death...we're a huge leap closer to having human-machine interfaces.
Now all we need is the drivers.
Sunday, June 13, 2010
I'll admit, you had me. Back in 2000, when I went off to college, I was deeply engrossed in the pop-punk scene. I had every Green Day, Blink 182, and Good Charlotte song memorized, and could play virtually any "powerchord punk" song by ear on my electric guitar. I even went to see Good Charlotte and Simple Plan live, I was so pop-punk. I saw Blink 182 at Sandstone Amphitheater, saw Green Day in St. Louis, and was so all over the first Fall Out Boy album, back when they were just poppy filth.
But by 2002, my summers had become beer-soaked country. You had reinvented yourself; my dad's genre of Oak Ridge Boys, Dolly, Reba, and George had become a fresh, vibrant genre of Kenny, Martina, Rascal Flatts, and other young talent. I drank in your lyrics about summer, about women and work and life, and really enjoyed it. Nothing, it seemed, sounded better at the end of a long, hot day than some good country music.
Meanwhile, pop music was suffering. Artists and recording companies were facing a serious problem of appeal: they sounded whiny beyond belief as they complained about Napster and illegal file sharing, and there was a vacuum of good talent.
But I have to tell you, Country Music, that today you may have lost me again. Normally I try to not listen to country from September through March, so that your sounds are fresh when I begin to tune in every Spring, and so that the short playlists of local country radio don't get tired and grate on me too much.
The last couple years, though, a strange trend has appeared. Country music has become a place where artists that have exhausted their original markets can retreat and sell a few more records. Last summer, out of nowhere, I had to listen to Jewel singing cliche tripe endlessly. Then, to make matters worse, some droll, repetitive Kid Rock song about smoking pot ruthlessly mined "Sweet Home Alabama" for a melody. It was as if the absurdly awful music of "Big and Rich" had opened a gateway to mid-90's pop trash, and they'd all been let through with a free pass and a guarantee of Top 40 airtime.
And then you let Hootie in on the party. Now, this has nothing to do with Hootie being a black man in a typically white genre; I hated his song long before I made the connection. But when I mentioned to my friend how much I hated that song, and he said "Darius Rucker is Hootie, you know?" it all made so much sad, tragic sense. But I tolerated their presence, like the ugly cousins at a reunion that drink too much and embarrass themselves, because brilliance like Little Big Town and Miranda Lambert and Eric Church still dominated the scene.
And then this morning, on CMT, I saw Nickelback.
Yes, this is the same Nickelback that is less popular than a pickle. The same Nickelback who is so awful, so incredibly, logic-defyingly horrible that they are the example people use when they rail about modern rock and its appeal to stupid white-trash. And there, at number 18 on the CMT Top 20 Countdown, was Nickelback. Please.
I didn't even listen to the song; I had Anna Nalick and Panic at the Disco playing on Pandora, and thank God the TV was muted. But as I watched, I couldn't help but feel like the last nail was in the coffin, and perhaps my September country music sabbatical must start early. As in immediately.
The problem, country music, is that if you continue to let pop artists treat country music like a place where they can exploit idiots who aren't tired of hearing their music elsewhere, then you will continue to be the disliked cousin at the music family reunion! Do you really feel like it's smart marketing to let Toby Keith sing another song about kicking the illegal immigrants out? Do you really think we want to hear Hootie sing another song that sounds just like his short and unexceptional career in pop music? Do you think we don't know who Jewel is? Are we expected to welcome her, when the rest of the genres clearly don't?
Meanwhile, pop music via Lady GaGa is experiencing a renaissance.
Nickelback? Seriously? You had me, Country Music. And by turning into "the music genre where we count on our audience to be total idiots," you've lost me.
Friday, June 11, 2010
The American Bird Conservancy estimated in 2003 that between 10,000 and 40,000 birds were killed each year at wind farms across the country, about 80 percent of which were songbirds and 10 percent birds of prey. "With the increased capacity over the last seven years, we now estimate that 100,000 – 300,000 birds are killed by wind turbines each year," said Conservancy spokesman Robert Johns. By our math, that comes to 274 to 822 birds a day killed by wind farms across the country.
And then this nugget from another reader:
40,000 or 300,000 birds is a lot - but a tiny number compared to some other causes of death. The US Fish and Wildlife Service estimates (PDF) that somewhere between 97-196 million birds are killed annually by collisions with building windows.
So you're saying we need to sue the glass manufacturers? The glass manufacturers would probably just say we should blame the window sales companies. The window companies would say we need to blame the contractors who installed the windows. The Republicans would say that the window installer's union is the root cause. And eventually the glass manufacturers would argue that only a single bird has actually been killed by collisions with building windows.
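For anyone who wants to check the per-day arithmetic in the Conservancy quote, it holds up:

```python
# The Conservancy's annual wind-farm estimate, converted to birds/day.
low, high = 100_000, 300_000

per_day_low, per_day_high = low / 365, high / 365
print(round(per_day_low), round(per_day_high))    # 274 822

# And for scale, the low end of the window-collision estimate from
# the second quote, against the HIGH end of the wind-farm estimate:
windows_low = 97_000_000
print(round(windows_low / high))                  # 323
```

In other words, even the most conservative window figure is over 300 times the most generous wind-turbine figure.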
GOD says to me with a kind of smile,
"Hey how would you like to be God awhile
And steer the world?"
"Okay," says I, "I'll give it a try.
Where do I set?
How much do I get?
What time is lunch?
When can I quit?"
"Gimme back that wheel," says GOD.
"I don't think you're quite ready YET."
Thursday, June 10, 2010
It occurred to me today that in the meantime, cities that want to reduce congestion at busy intersections should charge people to use those intersections. This would incentivize people to take alternate routes, or better yet, other forms of transportation.
Tuesday, June 8, 2010
1. Scientists have announced artificially made organisms. Obviously the plan hinges on the ability to produce anaerobic bacteria (or heavily modified algae) that can break down surface compounds and release oxygen and ozone. It would potentially also work if the microbes converted the CO2 in the Martian atmosphere into oxygen and ozone. For years I have wondered just what the problem is (I really don't know the answer) that prevents scientists from making bacteria with new functions. For years we've had "Bt corn," which is corn that produces the Bacillus thuringiensis toxin, which is toxic to the European corn borer. Specifically, genetic engineers have added a new trait to an existing organism. If a set of genes exists in a creature here on Earth that allows it to live by mining its own oxygen from the rust in surrounding rocks, why can't we isolate those genes and add them to an algae?
2. Bacteria have been found in Canada that could possibly survive on Mars. As our exploration of our own planet continues, we are learning that bacteria can survive and thrive in harsher and harsher places. It is no surprise to me that scientists have found a bacterium that survives in a place here on Earth with conditions potentially worse than those on Mars. Understanding the methods these bacteria use to survive could be key to building a terraforming microbe to ship to Mars...
3. Falcon 9 successfully launched. Honestly, if a private corporation announced today that they had developed the above terraforming microbe, and wanted to terraform Mars and claim it for their own real estate/mining/emerging markets efforts, what currently is there to stop them? The simplest answer is that their access to space would be halted by governments, who currently regulate all flights into Earth orbit. But with the speech from President Obama that commercialization of space is a priority, and with the first successful launch of a privately owned spacecraft, one has to wonder: how far are we from a legal battle for space rights? If SpaceX landed a colony ship on the Moon...there is no existing entity on Earth that could stop them from claiming the Moon as their property...other than the courts...
I am retreading old ground, I know, but the point I want to make is this: a corporation with foresight and long-term planning strategies (and a butt-load of capital) might find the idea of terraforming Mars very attractive. What better way to secure the future of their business than to create an entire planet of future customers? Especially if this planet reaches some sort of population cap, or worse, an ecological disaster or war eliminates much of this planet's population. This century might be the century of globalization, but perhaps smart corporations, armed with terraforming tools, might make the next century the one of solar systemization.
Lieutenant Reginald Barclay, in the Nth Degree episode of Star Trek: TNG.
Contrary to the article, however, I don't see this as a good application for satellites. I see it as the elimination of satellites. Why send a satellite to space to broadcast information when you could do it literally anywhere?
Sunday, June 6, 2010
Friday, June 4, 2010
Human evolution is peppered with such trade-offs. We wanted to walk upright, so we needed a different hip and spine structure. This cost us the ability to bend over and trot on all fours like gorillas often do. We wanted thicker legs for jogging, but in order to jog we had to give up massive shoulders and arms. We wanted big brains, but it cost us mandible strength. We wanted even bigger brains, but it meant we had to be born very early, rendering us helpless infants for longer than virtually all other creatures on the planet. And to feed that brain, herbivorism was just impossible, meaning protein-rich foods like meat had to be regularly obtained.
So while Douthat tries to reconcile the Millennials' lack of empathy with their enhanced idealism, I see no conflict. The internet has allowed my generation to reach out in ways never before possible.
But is it so hard to believe that perhaps we humans have a finite amount of empathy in us? We seem less locally empathic than Gen X, but more empathic globally. Could it be that Gen X simply is spending the same amount of empathy we are, but just at a different level?
In layman's terms, I can only give a shit about so much. And while I give a shit about the plight of Palestinians, the victims of the oil spill, torture victims, my parents, my family, people struggling with unemployment, NASA, and various other entities...it just doesn't leave me a lot of empathy to give to my friends who are whining about God only knows what.
The genetics of this make sense...it only makes sense for the brain to have enough empathy built into it to support empathy for an individual's local tribe. Extra empathy just costs the individual resources. My generation hasn't changed how much empathy we have...we've simply changed the definition of our tribe.