In the first ten minutes of Saints Row IV, I climbed up a nuclear ICBM, in flight and on a course to hit the US, and disabled it, as Aerosmith's "I Don't Want to Miss a Thing" played and my teammates told me how much they'd miss me and appreciated my sacrifice. Naturally, I survived and became President of the United States and named my street gang as my aides and cabinet. After that, things started to get implausible.
I'd never played a Saints Row game; I didn't know what the franchise was all about. It didn't seem to matter. Within moments of my stint as president, the Earth was invaded by malicious aliens, and I was transplanted to a version of The Matrix, along with most of my old street gang and millions of other humans. The premise is nonsensical (ok, so was The Matrix's), but I didn't give a shit. Every moment spent running and flying over the virtual city of Steelport was a joy. The characters were funny, the soundtrack was used to great effect, there were dozens of hilariously goofy gimmicks and a large dose of lampooning of other games, and the campaign was very satisfying.
Judging by its previous incarnations and its trappings, the gameplay in Saints Row IV might seem like a Grand Theft Auto clone. It isn't. In fact, the game eschews everything that's irritating about open world games. It incorporates everything that was great about inFamous and Prototype 2, but somehow feels even more liberating than those did.
Did I mention that it passes the Bechdel Test? In spades.
In fact, the game begins (after the ICBM sequence) with character creation, and not only can you be a man or a woman, but there are a few voice sets for each. My short-haired, female "Player" (get it? The character is called Player.) felt as if the game was designed for her. But, in fact, I could have chosen to be a man. Anyway, she is thrown into the Matrix ... err ... simulation just as Niko Bellic is thrown into Liberty City. The main difference is that she shortly gains superpowers, courtesy of her friend hacking the simulation. (One of the male voice sets is "Nolan North Voice", which is hilarious in and of itself.)
The character starts out pretty powerful: I could run very fast (i.e. I would never use a car) and jump very high. By the end I could glide for a mile, run up the sides of buildings, fling fire from my fists, pound the ground from 1000 feet up, kick my enemies in the balls, freeze them, and then punch them into tiny pieces. Oh, um, there are guns, too.
I've found the time to play most of the major open world games. From the previously mentioned inFamous (1 and 2) to Prototype, Sleeping Dogs, The Amazing Spider-Man, Red Dead Redemption, GTA IV, and, of course, Just Cause 2, there have been quite a few that were enjoyable. The measure of success is whether I feel like just hanging out in the world, collecting things, playing. All of those made me want to do that, but nothing like Saints Row IV. The mechanics are perfectly tuned; I could execute incredible leaps with accuracy. I approached story missions slowly, not because they were bad, but because I knew they would hasten my finishing the game. Even the races were fun. The fucking races.
And even though the plot is silly, and self-consciously so, it never got tiresome. There's enough wit and out-and-out humor in the dialog to keep me smiling. From singing Biz Markie with an old friend in the car to trying out my new Dubstep Gun, there were numerous moments that had me belly laughing. The game has an incredible mix of different tasks and scenarios, changing from an open world game to a 3D platformer, a 2D tank game, a side-scrolling beat em up, and a Metal Gear Solid-esque stealth game at a breathless pace. It's one thing to lampoon and reference other games, but to do so and actually have it be fun? It almost beggars belief.
The soundtrack isn't nearly as big as GTA's or Sleeping Dogs' were, but what's there is excellent and eclectic. Several of the soundtrack songs come up during missions, and their obvious reuse does stick out a bit, but hey, Deep Silver doesn't have Rockstar money.
"The plague of non-reproducibility in science may be mostly due to scientists' use of weak statistical tests, as shown by an innovative method developed by statistician Valen Johnson, at Texas A&M University in College Station."
The article goes on to describe the amazing new Bayesian test Johnson invented, which shows that scientists are using "risky statistics". Ok, no.
Let's talk turkey for a minute here. This is how an experiment actually goes most of the time:
Your boss has an idea for an experiment. It's based on some prior work which was kind of understood. That's what funding is given out for.
You spend a ridiculous amount of time and energy setting up the equipment, fixing all the electronics, plumbing, and Labview problems.
You do the experiment and see nothing.
You continue fixing things.
You finally get something. You optimize conditions, you collect data.
You analyze the data, and it's hard to explain. It's noisy even when you did everything you could to make it good.
Out of the hundreds of plausible explanations for the data, you choose one that your boss likes, and you write a paper. You write a paper because that's expected of you regardless of what the result was. You can't not publish a paper just because the data makes no sense.
The statistical test isn't the issue. The issue is that experiments are hard. Unless the phenomena are so well known that nobody would bother doing an experiment in the modern day, the equipment to measure them is experimental, home built, or being used in an unprecedented way by people with no technical training (grad students). You can sit and mangle the data all day, apply any number of tests, make your curve fits more robust, whatever. The chances are that your findings are going to be debunked or at the minimum refined. This is ok! But it means that cutting edge stuff isn't that reproducible.
Sure, I have encountered statistical fallacies in other people's work. But in none of those cases was the cause solely the ignorance of the researcher. Rather, it was usually because someone was trying to fit something that was unfittable, because the data was extraordinarily hard to collect and therefore noisy and ambiguous. Or someone was quoting an error estimate based on a few data points, which is not good, but only because getting each of those data points took a month. If the researcher had a ton of data, the error estimates would be better because the statistics would be better. Then, and only then, should we start to delve into Bayesian notions of the cause of the data.
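The few-data-points problem is easy to demonstrate with a quick simulation (all numbers below are invented for illustration): repeat a noisy "experiment" many times and watch how unstable the estimate of the mean is when each experiment contains only a few points.

```python
import random
import statistics

random.seed(0)

def sample_mean_spread(n, trials=2000):
    """Spread (std dev) of the sample mean across many repeated experiments,
    each measuring a true value of 10.0 with measurement noise of std 2.0."""
    means = [statistics.mean(random.gauss(10.0, 2.0) for _ in range(n))
             for _ in range(trials)]
    return statistics.stdev(means)

spread_small = sample_mean_spread(3)    # a month per point -> only 3 points
spread_large = sample_mean_spread(48)   # the "ton of data" case

print(f"std of the mean, n=3:  {spread_small:.2f}")
print(f"std of the mean, n=48: {spread_large:.2f}")
```

The spread shrinks roughly as one over the square root of the number of points, so the three-point error bar is about four times less certain here; no amount of test-swapping fixes that.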
For several years, all of my computing, note taking, and organization have happened on a few devices. Most of my mental computing work (actual writing, modeling, programming) happens on my laptop, a three-year-old Dell with a Core 2 Duo. My note taking, conceptual thinking, and written calculations happen on an iPad (3rd gen) with a Jot Pro stylus. Finally, all my creative work, any CAD done at home, and gaming happen on my PC. The PC is irreplaceable, as no portable system can do what it does. But the two mobile devices could, in principle, be combined in the Surface Pro 2.
Any replacement would have to support handwriting, have long battery life, and be able to run my Windows applications, such as Matlab, R, LaTeX, the Adobe products, and Office. The Surface Pro 2 is a Windows 8 laptop with a Wacom digitizing pen screen. Rather than a "convertible", it's an actual tablet with accessories that emulate a laptop. There is a snap-in keyboard and a "kickstand" which props the system up. It has a USB 3.0 port and Mini DisplayPort out, so it can be used with most external hardware and with a projector.
Here are my first impressions:
My interaction with the Microsoft Store was sort of awful. I pre-ordered the Surface Pro 2, and selected the Touch Cover as an accessory. When it came time to ship, my credit card company flagged the purchase as possible fraud. I called MS, and they said not to worry, I would still get my laptop on the date promised. This was a lie, as a few hours later I got a notice that my order was canceled. A second call to MS allowed me to re-pre-order it, and I got it only a couple days later than expected. The first MS Store rep also told me that the Touch Cover I ordered wouldn't work with the Surface Pro, which was also a lie. Fortunately, MS has a person maintaining their Twitter account who knows what he or she is talking about, which is more than I can say for their phone representatives.
I do not recommend the Touch Cover. I'm a pretty good touch typist, and it's barely usable: I make about one error per two or three words. I'll probably return it.
Logging in to the Surface is a hassle because your login is tied to your Outlook.com account. I keep very obscure, long, random passwords for all my online accounts, including my Outlook.com account (which ordinarily 1Password auto-fills for me). Since MS uses that password to log into the machine, I had to type that awful thing several times. Getting the machine not to require a password every time I woke or booted the computer was non-intuitive, because ... Windows 8. The option is buried several layers deep in the control panel, and it wasn't found by search. Eventually, I got it to wake without a password and boot with only a 4-digit PIN. That should be an option when you set up the machine.
The screen has a pretty good pixel density, which is a double-edged sword. Obviously, better pixel density improves anything scalable, such as video and apps designed for a scalable desktop rather than one based on fixed pixels. But most applications weren't designed that way, including really big, major products like Adobe Illustrator. There is no solution to this at present, nor do I expect one for CS6, since it is now obsoleted in favor of Adobe's current business model as a hostage taker/blackmailer. At native resolution, the fonts on Illustrator's menu items look like 6 pt. That's not easy to read.
The experience with the MS app store has been kind of miserable. Many commonly used apps available on iOS, such as Feedly, are not there. The ones that are there are fairly gimped or totally nonfunctional. The Facebook app, for instance, does not allow system level sharing (such as "share my desktop screenshot") commands, which you would think it would. The Twitter app DOES allow for it, but using it permanently breaks the app. I'm not kidding, the app has to be reinstalled to get it working again. Also, scrolling in the Twitter app sucks balls. The Evernote app, called "Evernote Touch", should not exist. Microsoft should be throwing money around to make this better right now, because if this is their strategy going forward they have a serious issue.
The kickstand angles aren't bad (there are two), but you definitely will have to conform yourself to the machine rather than the other way around. Contrast that with my laptop which accommodates nearly any angle of the screen.
Microsoft drastically reduced the screen real estate given to scroll bars in Windows 8. Then they released a device you operate with your fat fingers. WHY?
Battery life is looking like 8 or 9 hours for a typical day, which is excellent.
Evernote proper has a note type called "Ink Note", which combined with the stylus is all I've ever wanted out of paperless technology. Evernote (NOT the app) works very well all around on the Surface.
Now that I'm using it, I have to say that Windows 8 really IS as bad as everyone says. Restarting is a chore, finding settings is ludicrously hard. It's an all around UI failure. I've been on Team MS for a couple decades, but I'm really not so sure now.
I'm not at all convinced that the stylus storage option using a magnet is reliable. They really should just include a band to push it into.
I guess Microsoft skimped on RAM, because during my first day of usage it told me I needed to close some programs, like Firefox, in order to run other things, like games. That's sad, because RAM isn't expensive. I get the reasoning, since swapping heavily to the SSD would be a very bad idea, but couldn't they have spent a little extra on the $1000 model and put in 8 GB?
I tried playing a lightweight 3D game, one of Telltale Games' Monkey Island episodes, and it basically worked fine. This definitely is not a gaming rig, but point-and-click games ought to run well. I didn't even have to turn the settings all the way down.
Playing Youtube videos fullscreen could be better. The "fullscreen" icon is really small compared with my index finger.
Why would they not make the screen lock when you close the Touch Cover? It's bizarre.
Microsoft ostensibly has made a thing that you would switch to from an iPad, but they made its functionality different enough from the iPad to be confusing. When you push the screen lock button the computer actually goes to sleep, as a laptop would, rather than just turning off the screen, as an iPad would do. Contrast this with what happens when the screen turns off from inactivity, where the computer is still on! Very weird. Also, given enough time asleep, the Surface appears to Hibernate, requiring more time to turn on the next time. This isn't like an iPad, which is always ready to go. MS has muddied the water by copying some features (a lock button) but having it do subtly different things.
The on-screen keyboard is better than iOS's, except that the screen doesn't react to it. Menu items or buttons that occupy the space where the keyboard is simply remain covered up until you hide the keyboard again. Poor.
Virtual Reality has been a joke for years. For all the derision heaped on nonexistent flying cars, jetpacks, and cures for cancer, no seemingly straightforward bit of technology has been vaporware for so long. 1992's film The Lawnmower Man enticed us with the idea of entering a fictional world, a facsimile of Star Trek's Holodeck, without the need for matter/energy conversion or force fields. But the unfeasibility of the technology in 1992 is fairly obvious when you look at the problems: virtual reality headsets, the central component of such a system, would have had to be made of cathode ray tubes back in the time of Lawnmower Man. CRTs are heavy, huge, power hungry, high voltage, glass monstrosities—they therefore make bad apparel. Moreover, the devices needed a way of tracking the orientation of the headset, so that when you turn your head the image in front of your eyes changes accordingly. No good way existed to do this with any real speed, certainly not cheaply. And two of them are needed for stereoscopic (3D) viewing. VR headsets could be produced, but were so bad and expensive that they never really got beyond the lab or high-tech industries.
In the late 90s and early 2000s, head-mounted displays using LCDs were released by several manufacturers, but these were normally quite low fidelity and/or obscenely expensive. Large, low-density TFT-LCD screens, often backlit with hot lamps, were becoming affordable, but nothing small and energy efficient. Consumer models with true head tracking never materialized at anything above 600 vertical pixels, well below what's needed for realism. There simply wasn't a large market for small, high-density screens.
Screens weren't the only problem, by a long shot. Lawnmower Man may have shown Pierce Brosnan a terrifying image of his test subject Jeff Fahey, but the graphics were deliberately crude and abstract due to the limitations of computing, and especially graphics processing, at the time. The input technology then available for tracking walking and hand movement would have been no more sophisticated than a joystick; practical motion capture like that used in The Lord of the Rings was still years away, requiring fast, high-definition digital photography and computers that could parse the data in real time.
All of these problems have been solved. In fact, you've likely already interacted with many of the solutions. And now there are people preparing products that combine the technologies into inexpensive, amazingly lifelike virtual reality. It's coming next year, and a lot of people have already been using it for months. It's going to be incredible, a watershed moment for interactive media.
The solution begins with a device my actual grandmother literally owns: an iPhone. Specifically, the iPhone is a high-pixel-density screen married to a number of micro-electromechanical sensors and a battery with a whopping energy density despite being small and lightweight. It can sense orientation, display a high definition image, and doesn't require a huge power cord or hefty weight to do it; that's practically all a head-mounted display has to do. Put two of these suckers into a piece of plastic, add some optics, and you have the Oculus Rift.
To make a realistic, brain fooling virtual reality setup, you have to get several things right. First, the eye has to focus far away without simply making the screen look far away, which means that nonspherical lenses have to be made. This allows the screens to be physically near the eye but appear to be far from it, and still retain very large field of view. Peripheral vision has to work, and it does with the Rift. The head tracking is incredibly fast, fast enough to be capable of fooling people into thinking the experience is real. Among those people who have ordered and received the Developer's Kit, many report that going inside the Rift is like going into another world; exiting it is disorienting, as you feel you are shifting from one reality to another. If someone taps on your shoulder while you're in the Rift, things get very uncomfortable. Horror games are TOO scary because all of a sudden you really feel like you're there, in danger.
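The focus trick is essentially the thin-lens equation: put the screen at, or just inside, the lens's focal length, and the virtual image forms far away, so the eye can relax while the panel sits an inch from your face. A toy calculation, using made-up focal lengths rather than the Rift's actual optics:

```python
def virtual_image_distance(f_mm, screen_mm):
    """Thin-lens relation 1/f = 1/d_o + 1/d_i (negative d_i => virtual image).
    Returns how far away the screen *appears* to be, in mm (None = infinity)."""
    if screen_mm == f_mm:
        return None  # image at infinity: the eye focuses as if far away
    d_i = 1.0 / (1.0 / f_mm - 1.0 / screen_mm)
    return -d_i  # flip sign: the virtual image sits on the screen's side

# Hypothetical numbers: a 40 mm focal-length lens with the screen 38 mm away
print(virtual_image_distance(40, 38))   # about 760 mm: appears ~0.76 m away
print(virtual_image_distance(40, 40))   # None: appears infinitely far
```

Moving the screen two millimeters pulls the apparent distance from infinity to under a meter, which is why these headsets ship with swappable lens/spacing options for different eyesight.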
The Rift is also doing what it needs to do to integrate with current game makers, people who have spent the past 20 years pushing for huge amounts of graphical computing power to run their sophisticated 3D game engines. Oculus has made the task of programming a Rift game purportedly easy, something that past head-mounted displays never did. More importantly, they seem to have achieved critical mass, with game makers programming specifically for the Rift and scads of people trying out entirely new types of virtual experiences, such as a game where you experience getting your head cut off. Sony has announced that they are working on a competitor VR display for the Playstation, and such me-too-ism from a major gaming corporation shouldn't be taken lightly.
Oculus is still targeting a $200 launch price. That is highly affordable, much cheaper than previous head-mounted displays. The Rift seems to tick every box for an ideal visual experience. And the excitement around it is pushing inventors to work out solutions to the rest of virtual reality.
Take walking around. How would that work in a typical home? Voila, a company called Virtuix has developed a way of moving within games by actually walking or running with your own feet. Special shoes slip on a slippery surface, the motion is digitized by sensors on your ankles, and a support structure holds your body upright. Ben Kuchera said it felt "pretty fucking great." This is a first-generation prototype, and it works well with any game.
What about your hands? The Wii, Kinect, and Playstation Move were the trifecta of camera-based motion tracking of the current generation of game consoles. They all work by using a digital camera and a reference object. The Wii has the camera in the controller, which looks at the "sensor bar"; the Kinect and Move both use a stationary camera looking at the person. The Move tracks the apparent size of a glowing sphere on the controller to determine its location quite accurately. The Kinect works on the same principle but instead looks for a skeletal image in frame. A company called Sixense has extended this idea and built 3D tracking hardware that will watch your hands and render them in the game. The sensors can be attached to any part of your body to add more degrees of freedom to the virtual reality experience:
I have the full range of my normal movements, so everyone makes sure I have plenty of space to swing my arms and explore. I pick up a barrel, wind up, and throw it over the balcony, and watch it sail off into the distance.
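The size-to-distance trick the Move uses is just the pinhole-camera relation: apparent size scales inversely with distance. A toy sketch with invented numbers (not Sony's actual calibration):

```python
def sphere_distance(focal_px, diameter_m, apparent_px):
    """Pinhole-camera estimate: a glowing ball of known physical size
    appears smaller (spans fewer pixels) the farther it is from the camera."""
    return focal_px * diameter_m / apparent_px

# Hypothetical: 700 px focal length, 44 mm ball (roughly Move-sized)
print(sphere_distance(700, 0.044, 22))  # about 1.4 m away
print(sphere_distance(700, 0.044, 11))  # about 2.8 m: half the pixels, twice as far
```

Combine that depth estimate with the ball's (x, y) position in the frame and you have a full 3D position from a single cheap camera, which is the whole trick.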
Walking, seeing, hearing, manipulating are all possible in a 3D stereoscopic, high definition environment. What's left? The other senses. Could haptic feedback provide tactile sensory information? This group made a bunch of touchable buttons in mid-air using an array of ultrasonic oscillators to create focal points of touch in the air. We already have a small measure of force feedback in controllers through rumble. Chairs like this one, when used with the Rift, reportedly provided a full sensory experience much like being in a car, though they are currently quite expensive. But if the Rift takes off like I expect it to, peripherals for the virtual reality experience are going to finally undergo economies of scale.
Now, yes, there are going to be some hiccups. For one thing, if you only have a Rift, playing games with a controller reportedly causes nausea in many people. The sooner technology like Virtuix's walking/running setup catches on, the better, because by all accounts this is due to the disconnect between what the brain thinks is happening (movement) and what the body is saying (no movement). Also, you're going to need a pretty good computer to use the Rift. Many currently recommend a GeForce GTX 660 video card, which is about $175. In a world where many people's computers are Dells with cheap Intel chipsets, that isn't going to fly. Laptops are also basically a non-starter.
But for those of us already prepared for the device, it can't come soon enough. Developers won't just be adapting their games to use the Rift; they'll be developing games that are FOR virtual reality. That means games that would previously be "boring", where you aren't constantly shooting someone or jumping out of helicopters, can become really interesting. A game like Myst wasn't very compelling, but what if you were actually walking around in it, manipulating things? Who wouldn't want to do that?
I also think this is a boon for home video. The current technology for 3D movies is passive polarized glasses, both in theaters and on 3D TVs, and each of those lenses cuts 50% of the light out. The Rift already has two screens, one for each eye, so it can run at full brightness all the time. Moreover, on an airplane you could watch a movie on a virtual screen in a virtual movie theater. Abstracting away the stranger you're sitting next to seems like a great idea to me (that is, until he pokes you in the arm to ask to go to the bathroom).
The time is ripe for VR. The Rift ships next year. Hopefully lots of other people want it as much as I do.
Two things demonstrate to me that Jeopardy contestants don't understand rudimentary game theory.
Case 1: In many cases one person is way ahead, and the other two trail by almost enough to make it a "run-away", meaning that even if the second-place contestant bet the maximum amount in Final Jeopardy, it would be impossible to catch up. Example: say Contestant 1 has $12,000, Contestant 2 is a bit shy of $6,000, and Contestant 3 trails well behind.

Suppose there are only a few clues left, with no Daily Doubles available. What is the best strategy for Contestant 3? If there aren't enough clues left for him to reach $6,000, Contestant 3 should stop buzzing in. Contestant 1 would be foolish to bet a single dollar in Final Jeopardy if he has twice his nearest competitor's score, and Contestant 2 can easily earn enough to reach $6,000. Therefore, by continuing to buzz in, Contestant 3 does nothing but take potential money away from Contestant 2 and increase the likelihood of a run-away. This hurts Contestant 3 himself, not just Contestant 2, because if both Contestant 1 and Contestant 2 miss the Final Jeopardy clue, Contestant 3 can still win.
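The run-away condition is simple arithmetic. A sketch with hypothetical scores (these dollar figures are my own illustration, not from a real game):

```python
def is_runaway(leader, second):
    """True if even an all-in Final Jeopardy bet by the second-place
    player cannot overtake the leader (doubling still falls short)."""
    return 2 * second < leader

# Hypothetical late-game scores:
c1, c2, c3 = 12000, 5200, 2800

print(is_runaway(c1, c2))  # True: doubling $5,200 only reaches $10,400
# With a few clues left, Contestant 2 can still clear half the leader's
# score and force Contestant 1 to make a real Final Jeopardy bet:
print(c1 // 2)             # 6000
```

Every dollar Contestant 3 takes here makes it harder for Contestant 2 to cross that threshold, which is exactly why buzzing in works against him.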
Case 2: In any of the two-day contests, such as the finals of the Tournament of Champions, wagering a huge amount in the first game's Final Jeopardy is extremely imprudent. Instead, contestants should treat that Final like any other Daily Double. Although the score resets at the beginning of the next day, the dollar figure that matters is the total over both days. It's therefore insane to bet all your money unless you would normally bet all your money on a Daily Double (hint: whenever somebody bets a large amount on a Daily Double, there are gasps in the audience).
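You can see why the all-in bet is imprudent with a crude Monte Carlo sketch: when your expected two-day total is already ahead of your rival's, extra variance mostly adds ways to lose. Every parameter below (scores, probabilities, the normal-noise model of day-2 performance) is invented for illustration, not taken from real game data:

```python
import random

def win_prob(bet, day1=20000, p_right=0.65, trials=20000):
    """Monte Carlo: chance that your two-day total beats a rival's.
    Your day-2 score and the rival's total are modeled as noisy normals;
    all parameters are invented for illustration."""
    wins = 0
    for _ in range(trials):
        final_j = bet if random.random() < p_right else -bet
        mine = day1 + final_j + random.gauss(15000, 6000)
        rival = random.gauss(30000, 8000)
        if mine > rival:
            wins += 1
    return wins / trials

random.seed(1)
print(f"all-in day-1 bet: {win_prob(20000):.2f}")
random.seed(1)
print(f"modest day-1 bet: {win_prob(3000):.2f}")
```

With these made-up parameters the modest bet wins more often, even though the all-in bet has the higher expected dollar total whenever you answer correctly more than half the time; variance is the enemy of the front-runner.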