Blog Entries posted by A Blithering Div

  1. A Blithering Div
    Must be good, right?
     
     
    The visuals and overall aesthetic are fantastic.
     
    If there is one thing Supergiant should be known for, it is their art department. It's vibrant, colorful and, most importantly, atmospheric. The soundtrack is alright. It's up on SoundCloud somewhere should you wish to hear it.
     
    The game begins with Red (you) pulling that sword known as the Transistor out of some dead guy. You've had your voice stolen but still make vocal noises, so I assume it's a metaphorical voice. This is one of the few remnants of Bastion that seeped through to this game, in that the story is intentionally made vague in order to make it seem deeper than it is. You have various skills called "Functions()" that can either be one of the four active skills, an augment to one of those skills or a passive buff. You have a set amount of "memory", so you can't just mix 'n' match. You fight an enemy force called "The Process" because...alright, this game is rubbish. 72 minutes in and I cannot bring myself to play any more of it.
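     
    If the memory thing sounds abstract, here's a toy Python sketch of the constraint as I understand it; the Function names and memory costs below are invented placeholders, not the game's actual list:
     
        # Toy model of the Function()/memory loadout constraint.
        # Names and costs are made up for illustration, not pulled from the game.
        MEMORY_CAP = 8
        FUNCTION_COSTS = {"Slash()": 4, "Dash()": 2, "Drain()": 3, "Ping()": 1}

        def loadout_fits(selected):
            # A loadout (actives, augments and passives combined) is only legal
            # while its total memory cost stays within the cap.
            return sum(FUNCTION_COSTS[name] for name in selected) <= MEMORY_CAP

        print(loadout_fits(["Dash()", "Drain()", "Ping()"]))   # True  (6 <= 8)
        print(loadout_fits(["Slash()", "Dash()", "Drain()"]))  # False (9 > 8), swap something out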
     
    One of the reasons is the shitty combat system. You have a health bar but you can't heal unless you have the life regen function (which works more like the Mourneblade from Castlevania: Symphony of the Night, a much better game). When you "die" you either overload the Transistor, which from what I can work out doesn't do much, or you lose an attack. Pressing space allows you to "plan" your attacks within a given time frame. Sounds like a good idea until you do that and can't attack anything until that ability cools down. Once you lose a function, you have to go to two access points (upgrade stations/save points similar to those found in Dino Crisis 2 and Devil May Cry, both being better games) to get the lost functions back. Because this game is kind of "hack and slashy", this is a major problem.
     
    The other is the story. 72 minutes in and I know Red is a singer who was supposed to do a concert (I think) but couldn't because that's when the calamity "it" happened. I don't know why the Process is there. The Process wants me dead because vagueness, and the reason I should care about any of it is beyond my simpleton brain.
     
    Let's address the elephant in the room. Yes, this game was made by the same people who made Bastion. Bastion is one of my all-time favorite games. I didn't expect this game to be like Bastion at all. While it does share some things, it isn't the same. However, it probably should have been, because outside of the visuals, Bastion did everything better. Better combat, better storytelling, better story in general, characters that were given life and a better soundtrack.
     
    3/10 do not recommend
  2. A Blithering Div
    Kaveri is about 30 hours away from launching as of the time of writing, and some people such as myself are just itching for official benchies. This is pretty much AMD's Hail Mary to stay relevant on the x86 CPU front. I understand the shitstorm that will come with that statement, but I have good things to say about it, I swear.
     
    I don't think Kaveri will change much on the desktop, as GloFo's 28 nm process isn't able to clock as high in the same power envelope. That kinda sucks because I like my desktop rigs, but oh well. However, the "up to" 20% IPC claim is appearing to hold true: Sauce. This is excellent news for their mobile lineup. This is where I see Kaveri really kicking ass, should OEMs actually use the damn chips. Since mobile doesn't have to reach insane clockspeeds, Kaveri will easily outperform mobile Trinity parts. That and that alone would be enough, but you also get a better iGPU and better GPGPU performance. Considering how well mobile Trinity did against the mobile i series at the time, this may very well save AMD. That said, I may pick myself up an A10-7850K for a 24/7 BOINC PC.
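     
    To put rough numbers on that trade-off: single-thread performance scales more or less with IPC times clock, so a quick back-of-envelope sketch (only the 20% IPC figure comes from the sauce above; the clock speeds are placeholders I made up) looks like this:
     
        # Rough estimate: performance ~ IPC x clock.
        # The 20% uplift is the claimed figure; the clocks are made-up placeholders.
        ipc_gain = 1.20      # "up to" 20% IPC improvement over Trinity
        clock_old = 4.1      # GHz, hypothetical older part
        clock_new = 3.7      # GHz, hypothetical Kaveri clock on GloFo 28 nm

        relative = ipc_gain * (clock_new / clock_old)
        print(f"net change: {(relative - 1) * 100:+.0f}%")   # roughly +8% despite the clock deficit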
     
    In short, mobile Kaveri is gonna be where it's at.
     
    Addendum to address the people that will inevitably mention overclocking: According to the source above, 1.5 volts was needed to get to ~4.5 GHz without the iGPU active. Better than Phenom II, but Christ does AMD need to get out of the WSA with GloFo.
     
    Unrelated note: If anyone wants another member for their BOINC team, send me a PM or something. I do Rosetta, Milkyway, Einstein and SETI.
  3. A Blithering Div
    Who's down for some good ol' fashioned OS bashing? I know I sure am. This pertains to NT 6.3, shipped straight from Redmond, WA to a warehouse and then to my domicile. Yes, I'm using Windows 8. No, it isn't a fresh pile of dung, more like a rehashed and repacked pile with a fancy ribbon on. I can't really comment on the not-Metro UI as, shockingly, I've only used it a few times, and two of those times were to pin the Control Panel to my taskbar. As such, we'll gloss over the elephant in the room for the most part.
     
    The installer is still bleh. Click Next a few times, format the C: drive and you're pretty much done. There are no options for designating certain drives to directories until you get into the full OS, where you can software RAID/JBOD with Storage Spaces and take a performance hit for not using a hardware controller.
     
    The Explorer UI is roughly the same but now uses drop-down ribbons instead of a ribbon with commonly used tasks in conjunction with a menu bar. It's 3 clicks to shut down via the GUI rather than 2 for Win7. Performance is roughly the same between 7 & 8, though I can't help but get the feeling that disabling the not-Metro UI would give a bit more performance. Task Manager is a bit funky now. The Processes tab and Details tab do about the same thing, but Details has a bit more meat on the bone, with Set Priority and Set Affinity being in its context menu among other things.
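     
    As an aside, the same Set Priority and Set Affinity knobs can be poked from a script; a minimal sketch using the third-party psutil library (the process name here is just a placeholder):
     
        import psutil

        # Hypothetical target process; swap in whatever you actually want to tweak.
        for proc in psutil.process_iter(["name"]):
            if proc.info["name"] == "notepad.exe":
                proc.nice(psutil.HIGH_PRIORITY_CLASS)   # Task Manager's "Set priority" (Windows-only constant)
                proc.cpu_affinity([0, 1])               # "Set affinity": pin the process to cores 0 and 1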
     
    And now to go Office Space on the dead horse. Windows 8 is the worst version of Windows Microsoft has ever released. System Refresh is pointless, as it's essentially a fresh install that leaves behind old system files you can't delete from within the system. It also "deletes" your old programs by moving them into Windows.old so you can presumably try to salvage something (IDK what, but something, dammit). You have to be able to boot into a full desktop in order to get into Safe Mode, as it requires you to go into advanced settings in not-Metro. Event Viewer gives you very little information on blue screens of death (though I don't think that's a new development). The not-Metro UI is part of Safe Mode. I am not making this up. You cannot do a System Refresh or reinstall if UAC is set to maximum, for reasons I can't fathom. All Windows 8 consists of is more abstraction from actually being able to control your computer. That's all it is. It has no compelling features over Windows 7 to make it worthwhile. Sure, it does in fact boot faster, however A: I'm on a desktop that gets turned on once and turned off once, and B: sleep mode doesn't suck anymore. And if you really want power saving in a portable, you should be hibernating anyways. All of these are things that I have had experience with under Windows 8, and quite frankly, it's shit from any angle.
     
    TL;DR: It's rubbish.
  4. A Blithering Div
    Back in the really olden days of PCs, it was common to buy a $600 "breadbox" computer (Commodore 64, anyone?), hook it up to the TV and away you go. Fancier computers like Apple IIs and IBM PCs had their own monitors. Now, those separate PC monitors were usually progressive scan, meaning the whole frame is drawn on the screen at once rather than as alternating lines (interlaced). Fast forward to the HD craze and now people who I can only hope know better state that their computer monitor is 1080p or 1440p or something like that. At this point in time, PC monitors have been progressive scan for decades, and the large CRTs had pretty high resolutions. I have an old Dell CRT that tops out at 2048x1536. Yet for some reason, it is now necessary to state that your PC monitor (not a television, where it actually differs) is progressive scan. It's not even shorthand. If I said my monitor was 2160p, would any of you even guess that I was talking about a 4K monitor? What about 1200p? 1050p? 4320p? Seriously, it's annoying to see 1080p and 1440p when typing out 1920x1080 is far more informative. How? I can see how many horizontal pixels there are. 1920x1200 tells me A: it's high resolution, B: it's on a computer monitor, because TVs are marketed with xxxxp, and C: its aspect ratio, which in this case is 16:10.
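     
    And if you ever do want the aspect ratio out of a WxH pair, it's one gcd away; a quick Python sketch:
     
        from math import gcd

        def aspect_ratio(width, height):
            # Reduce WxH to its simplest whole-number ratio.
            d = gcd(width, height)
            return f"{width // d}:{height // d}"

        print(aspect_ratio(1920, 1080))   # 16:9
        print(aspect_ratio(2560, 1440))   # 16:9
        print(aspect_ratio(1920, 1200))   # 8:5, i.e. 16:10
        print(aspect_ratio(1680, 1050))   # 8:5, i.e. 16:10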
     
    So, in short: stop with the 1440p and type out 2560x1440, please.
  5. A Blithering Div
    I've been reading AT forums for a while now because I want to know every fuzzing detail there is to know about Intel's upcoming architecture codenamed "Haswell". In one of the threads, there was something about a show called Computer Chronicles and, as I'm easily amused by watching fast screens, I watched the episode where they compared Intel's Pentium CPU to the 486. I had a good amount of laughs as they talked about motherboards with 256K of cache and having to use a heatsink because the chips run so much hotter. It got me thinking about my first computer. The first computer I owned wasn't technically mine at first, but I was pretty much the only one using it. It was by far the most unbalanced pile of rubbish, made from parts that I'm assuming were found in a bin somewhere. It had an Intel Pentium MMX at a blistering 233 MHz with 24 megs of SDRAM, a Cirrus Logic video card with some presumably very small amount of VRAM and an unbranded ISA sound card, all piled on a miserable Baby AT motherboard I've long since lost, and I'm pretty sure it had a SiS chipset as well. It also sported a 10 gigabyte hard drive and another 15 gig drive I didn't find out about 'til about a year after I got a Socket 478 rig. Ran Windows 98 like a champ that just had diarrhea. But it played games within reason. I could play Doom very well, along with Starcraft, Tiberian Sun, Unreal, Duke Nukem 3D, some crappy NASCAR game, Sonic CD along with a Sega collection for PC, and Red Alert 2. Out of all of those, Red Alert 2 was the worst. 1 in-game second counted as 2 real-time seconds. On a 2-player map. At 640x480. Still fun though. Couldn't get the sound to work in Unreal or Duke Nukem either, but I'm tellin' you young whippersnappers that those were the days. I remember trying to get the sound to work for quite some time. It wasn't muted or anything, but after messing with the BIOS, I gave up. Fun times they were.
  6. A Blithering Div
    Both companies unveiled new gaming devices at CES this year. Nvidia unveiled a mobile handheld called Project Shield: http://www.theverge.com/2013/1/7/3845282/nvidia-announces-project-shield-handheld-gaming-system It is essentially a controller with a 5 inch flip screen on it. The most important aspect of this is the new Tegra 4 CPU, which is a quad-core ARM A15-based chip with what I'm assuming are 72 Kepler-based CUDA cores (the next Nexus 7 should be pretty awesome should they use it). The device itself runs Android 4.1 and will allow users of Nvidia's 6xx series GPUs to stream games to this device via local wifi. Now, my question is one of why would I want to buy this? It's literally a 5 inch tablet with a non-detachable controller that will either not fit in a pocket or be really uncomfortable, which defeats the mobility aspect. On top of that, what incentive do developers have to make games for this thing? Furthermore, why would I want to stream games from my more powerful and higher-res PC to a controller? All of that being said, I could see this doing really well for emulation, but if that's all you're getting it for, a Pandora may suit you better. One last little tidbit: the audio system rivals laptops equipped with Beats by Dre. I can't imagine that being hard to do, but whatever.
     
     
    If you find that device not particularly useful, it gets better. Razer, the gaming mouse company behind the Blade, has announced a 10.1 inch gaming tablet: http://www.pcmag.com/article2/0,2817,2413967,00.asp Apparently, those who purchase Razer products really wanted a 10 inch dockable tablet for playing their extensive PC game library on the go, because laptops are just too inconvenient. I mean, I don't get the Shield, but I could probably find a use for it. This thing however... Why? What market is this thing targeted at? People who literally cannot find a 13 inch laptop with a dedicated GPU? I seriously don't understand the purpose of this. Okay, it's lightweight, but that doesn't really matter as I'd have to carry "docks" to play games on it. Sure, I can plug another battery into it, or I could just get a larger laptop battery. To top it all off, the base model is $1 short of a grand. One thousand dollars for a 10" tablet you can play games on. Or, you could buy a laptop and be better off if you don't mind a few extra pounds.
     
     
    To end this on a good note, I'm going to change subjects a bit. Canonical has announced that it is working on Ubuntu for cell phones.
    I'm excited for this. Very much so. Whether or not it succeeds I can't say. It is a bit clunky right now and definitely needs some more polish. However, I like the way it looks. I love the fact that it's pure GNU/Linux without Java on top, which by default means better performance and battery life. I especially like the fact that it's the Ubuntu I know with a separate mobile interface. I.e., the way it should be done *cough* Microsoft, Apple *cough*. Now I'm just pulling this one out of the ether, but what if the "dock" mode they've been working on is used here? What if I can run Ubuntu on my phone, dock it via USB & HDMI, get a full desktop experience without losing phone features like telephony and GPS, and then undock it when I'm done and still be in 1 OS? If that were to happen, I would joy puke everywhere while crapping bricks. I always wanted a true pocket computer and this could be the deal.
     
    But enough of my diatribe, what do you think of all this?
  7. A Blithering Div
    Mini rant ahead...
     
    As the title suggests, there is a new MLP game for Android and iOS devices by Gameloft. It's like a pony version of CityVille. I'd love to tell you that this is a first-hand experience, but as it turns out I haven't played either. I'm not interested in CityVille, and this new game:
     
    Does not support Gingerbread
    Does not support BlueStacks, but that's not exactly shocking
    and lastly does not support Android 4.0 on an x86 platform, which is confuzzling, as Android is the Linux kernel plus Java, which means ultimate portability of code.
     
    Just a friendly heads up, should you either not own an iOS device, have a Gingerbread device, or both/something completely different.
     
    Now for some close-up animals with a wide-angle lens:
     

  8. A Blithering Div
    I find it quite astonishing that display technology went from a high-voltage electron gun fired at phosphor on the back of a thick glass vacuum tube, to independent crystals moving in front of polarized film, to individual lights on what is a glorified potentiometer. Yet the humble speaker hasn't fundamentally changed since its inception. It's still a piece of material connected to a coil of copper wire with current passing through it around a stationary magnet. The materials that make a speaker have changed over the years from paper to polymers, and the magnets changed from crappy iron to super awesome neodymium, but that's it. When we want better sound from them, we add more of them. Then we add crossovers, up the wattage, maybe throw in a port for bass reinforcement and call it a day. I would like to sit here and tell you that there is something better, but I can't. I haven't heard of a single alternative to the voice coil for audio reproduction. And yet, I'm OK with that. The speaker has done its job very well over the past ~100 years. A little time and a bit of cash can get you a pretty decent system nowadays.
     
    However, I would like to know who thought it was a good idea to put button-sized speakers in portable electronics, so I can smack them repeatedly in the ear. Laptops are far more guilty of this than other devices. You've got room in this thing. Stop using what is arguably a waste of space. Small speakers don't have to sound like garbage either. I own several "capsule" speakers and they're easily the best small speakers I've heard and will probably ever hear. Why aren't these, or drivers similar to these, in laptops? There's no excuse for it. My music should not have to sound like a cymbal truck crashing into a salt factory because you couldn't be bothered to put in a halfway decent sound subsystem. The drivers used in the "capsules" aren't that big either. The diameter of the driver itself is four centimeters. Four. Four centimeters. You could easily fit 2 of them in a laptop. You could probably fit 2 of them in a 7 inch tablet if you designed it right. By comparison, the width of the speakers in my craptop is 2 centimeters, but they give you two of them for that awesome stereo sound. Of course HP is making laptops with audio systems by Dre or whatever, and all I can imagine is the above but with some terribad bass boost akin to Bose's AcoustiAss module. I think this will change though, simply because it's so bad that anyone who improves it can easily market it. More so than a not-shit cell camera. Especially if a cell manufacturer somehow figures out how to stuff a 40 millimeter driver in their phone. Imagine being able to play your music on your phone and being able to hear it over the most mundane background noise, and with some actual accuracy. Same for tablets and laptops. Enough of my diatribe. What say you? Do you know of any voice coil replacements? Do you think one day laptop and tablet manufacturers will actually up the speaker size? Do I need to take some Pepto-Bismol for the crap I'm spewing out of my mouth?
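     
    For a sense of scale, doubling the driver diameter quadruples the radiating area, which is a big part of why the capsules embarrass laptop speakers; a quick sketch treating the drivers as flat discs (real cones aren't, but the scaling holds):
     
        import math

        def disc_area_cm2(diameter_cm):
            # Approximate the driver as a flat circular disc.
            return math.pi * (diameter_cm / 2) ** 2

        capsule = disc_area_cm2(4)   # the 4 cm capsule driver
        craptop = disc_area_cm2(2)   # the 2 cm laptop driver
        print(f"{capsule / craptop:.0f}x the radiating area")   # 4x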
     
    Note: This rant came about because someone at my work decided to play crappy dubstep over an iPod speaker, making it worse.
  9. A Blithering Div
    Cell phones have been toting cameras for nearly a decade now, if not longer. The technology gets smaller and better and so we see an increase in the uber awesomezorz megapixels, and yet not one cell camera that I've seen has handled any lighting condition outside of fluorescent or natural with any form of quality. Every picture is rather dark, a bit washed out (or sepia toned) and usually grainy. Why? Digital cameras aren't new. I understand that the actual sensor in a cellphone camera isn't big by any standard, but why is all the focus on megapixels rather than making "fine" image quality mean something (pun intended)? I own a Nokia N900. It says it has Carl Zeiss optics on the cover to the camera, yet it's easily beaten by a 12 year old 1.3 megapixel Fujifilm digital camera that uses an 8 megabyte SmartMedia card. I did install FCam and it does offer a bit more control over things, but low-light pictures still look crappy. Does it really cost that much board real estate to put a decent CMOS sensor in a phone? I'm not asking for a pocket-sized DSLR, but I am asking for a decent digital camera in a ~$500 phone.
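     
    The megapixel chase is the problem in a nutshell: cram more pixels onto the same tiny sensor and each pixel catches less light. A rough illustration (the sensor sizes and megapixel counts below are ballpark guesses, not the actual specs of either camera):
     
        # Approximate pixel pitch assuming a uniform square pixel grid.
        def pixel_pitch_um(sensor_w_mm, sensor_h_mm, megapixels):
            area_um2 = (sensor_w_mm * 1000) * (sensor_h_mm * 1000)
            return (area_um2 / (megapixels * 1e6)) ** 0.5

        print(f"{pixel_pitch_um(4.5, 3.4, 5.0):.1f} um")   # ~1.7 um: tiny phone-style sensor, 5 MP
        print(f"{pixel_pitch_um(7.2, 5.3, 1.3):.1f} um")   # ~5.4 um: old compact camera, 1.3 MP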
     
    Note: I didn't actually pay $500 for my phone.
  10. A Blithering Div
    After a 14 year hiatus from the series in its entirety, I got Pokemon Y. The last game I played was Red, so the jump from gen 1 to 6 is not so much large as more or less consisting of "WTF is this thing?" I've made it a bit past the 3rd gym thus far, so I'd like to lay my thoughts on you.
     
    First, the negatives, as this for once is going to be the shortest part. There's a bit too much carry-over from ye olden days. I understand it may be due to balance, but there's no reason to only have 4 moves available at a time or to only hold 6 Pokemon in a bag that can seemingly carry an infinite number of other items. The battle music isn't that great either. Similar structure to the old chiptunes, but it's not exactly tension-building.
     
    These are just a few drops of lemon juice on an otherwise quite tasty pumpkin pie. Maybe it's just some nostalgia coming up, but this game is quite fun. Pokecenters have the store in them, so there's no need to hunt down a store. Random trainers are scattered throughout to heal your Pokemon, so treks back to said center aren't always necessary. I especially like the Super Training. It's almost as if they realized that it's best to actually have some control over stats in an RPG. The Exp. Share is the best item since Rare Candy. I also like that they got rid of PC item storage, so now I can just carry everything thus far and not waste time at the Pokecenter like in the old country. It appears 14 years of tweaking and refinement have paid off. One of my friends is irked about Mega Evolution, but I don't understand it. In a game that has critical hits that can down equally leveled Pokemon that aren't weak to the attack, I don't think it's that big of a deal, and I welcome it as an equalizer of sorts.
     
    On a tangentially related note, I find myself paying more attention to the story. As in, there actually is one now. From what I remember of Red, it was: document every Pokemon in the game, catch said Pokemon (not MissingNo., otherwise you'll be missing a Psyduck among other things) and get all eight badges. Now, there's a point to the quest. Say what you want about Mega Evolution, but I think it gives going about the game more purpose than catching them all and filling a Pokedex. I also find myself considering the options more. SPOILER: I got to the point where you can obtain a Lucario and I thought about it for a second. Yes, sure, it may have wanted to tag along, but what about the trainer? Would it miss her and vice versa? What happens if I say no? Does it get cross and become a nightmare later? I did take it along for the ride, but there was some thinking behind it.
     
    All in all, I'm pleasantly surprised at how much I enjoy this game. Same basic concepts, and yet it still holds its charm. I'm even more surprised that 14 years later I hesitated for a few levels when it came to evolving Pikachu. It was only like 2 or 3... Alright, it was more like 7, but that's not the point. The point is Karate Chop is bullshit.
  11. A Blithering Div
    Well, a month and a half after Intel's Haswell CPUs hit the streets, I finally splurged on an i7-4770K to replace my 4 year old Phenom II. The difference is phenomenal. It really is. My framerates in TF2 alone are far better. The lowest I've seen was ~70 FPS on a 32-man 2Fort server, which certainly beats ~40, or at least it would. See, a long time ago when I bought my Phenom II, Microsoft had a deal on Windows 7 for college students where you could buy Windows 7 Pro for $30. It was great; problem is, the backup disc has no product key and I am no longer enrolled (hoping to change that very soon, but I digress).
     
    From what I've read in the Microsoft support center or whatever it is, all signs point to "sucks for you, pay up." Currently I'm running Linux Mint 15 with KDE for teh frameratez. But there's an issue. Last year I made the error of purchasing an MSI Radeon 7870. Under Windows, it's fine for the most part. Under Linux, performance is less than stellar because, well, AMD and drivers don't quite go hand in hand. In fact, I have to turn down some settings in order to get comparable framerates to Windows (game running natively, not in WINE).
     
    So I've come to a bit of a conundrum. Do I shell out for another Windows 7 key, "upgrade" to Windows 8 as it's about the same price, stay with Linux Mint, deal with it and hope things get better, or get an Nvidia graphics card and hope that resolves the issue?
  12. A Blithering Div
    TL;DR: See bottom.
     
     
     
     
    A long, long time ago (aka a few months ago) my much beloved square box COWON D2 met its fate after I got angry and punched the screen because of metadata malarkey. I'd show you a picture, but I either lost it or threw it away and don't remember doing it, so just imagine someone threw a Wiimote at it. Shortly after, I picked up a Sansa Clip Zip after doing a whole 3 minutes of comparing it to the Clip+. The reasoning behind this is that while it's not as big as my D2, nor does it have as good battery life, I know from past research that the Sansa Clips are some of the best sounding MP3 players on the market. The stock firmware supports the usual MP3 and Apple malarkey but, more importantly, both Ogg Vorbis and FLAC files, both of which are necessary for me. Long story short, it sounds a hell of a lot better than my old D2. It's able to amplify the whole soundstage at max (and above max, we'll touch on that in a second) far better than my D2 did. As I mainly use it in conjunction with my car stereo and I'd rather not crank my amp's sensitivity, this is very important. Interestingly, while it supports microSD/SDHC, it does not support microSDXC, which is odd as you'd think SanDisk, a flash memory company, would want to spread support for it, but I digress. While the D2 has some fancy sound enhancements (BBE guff) and does sound pretty good, the Clip Zip is better at stock settings and I no longer find the BBE guff necessary.
     
    However, there is a fatal flaw with the Clip Zip. The stock firmware is rubbish. Browsing music is alright, but the interface isn't the smoothest. It didn't display any album art for me, and the minimum display timer is 15 seconds, so if you skip a lot of tracks like I do, the battery dies well below its rated 15 hour mark (for comparison's sake, my D2 was rated for 52 hours, but even COWON admitted that was stretching it). "But Blithering Div!" you yell and waggle your fingers about, "You just told me that it's the best MP3 player, period." Yes I did, and that is because there is a saving grace. Those of you who tinkered with old iPods already know where this is going.
     
    See, there is an open-source MP3 player firmware out there known as Rockbox. What Rockbox does is take this lowly little Clip Zip and turn it into a music-munching powerhouse. It literally supports every codec I can think of, from MP3 to AAC to WavPack to APE to SID files and XM modules for chiptunes. I'm not sure if it's implemented yet, but MIDI file playback was in the works as well. Why? Fuzzed if I know, but I'll take it. I now have longer battery life too, thanks to being able to set the display timer to an integer lower than bloody 15 seconds. I can "overdrive" the amp to drown out idiots when I'm wearing headphones, its shuffle feature actually shuffles unlike my D2's, and you can adjust the dynamic range of (aka normalize) your music if you so please. And if you're super lazy like me and have some 24-bit audio floating around, it'll happily downsample during playback. It separates its file browser from the music "database" as well, presumably so that, should you have a file that requires a plugin to play back, you can actually get to it. It does have some caveats, like the control scheme. I've been using it for a few months and I'm still not used to it, but given I've used my D2 since 2008, that could be why. Another is that the database includes "The" in the metadata, so The Distance and The Seatbelts end up in the same section rather than under D and S respectively.
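     
    That last gripe is an easy one to describe, at least: the database just needs a sort key that skips a leading article. A minimal Python sketch (the extra artist names are filler):
     
        def sort_key(artist):
            # File "The Distance" under D and "The Seatbelts" under S by
            # ignoring a leading "The " when sorting.
            name = artist.strip()
            if name.lower().startswith("the "):
                name = name[4:]
            return name.lower()

        artists = ["The Seatbelts", "Santana", "The Distance", "Daft Punk"]
        print(sorted(artists, key=sort_key))
        # ['Daft Punk', 'The Distance', 'Santana', 'The Seatbelts']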
     
    All in all, this is the best music player when you factor in that the 8 gig player and a 32 gig microSD card will set you back ~$83, if you get the UHS-class microSD card like I did because I like instant gratification fast transfer speeds.
     
     
    For the TL;DR's: Clicky
  13. A Blithering Div
    I've known this for quite some time, but I've gotten pretty good at repressing things. Sometime in December or whatever, my brother's GF is pumping out another unit. I can't really describe it in any other way except with a sarcastic "great..." Now I have another germ bucket to deal with. Another yelling, whining, snot-bubbling, incoherent, dim-witted, bug-eyed, downright dirty "thing" to deal with, and I know I'll have to deal with it. After all, it's family. Can't exactly exile them (for the most part, but that's a different side and story altogether). It's gonna be tainting my air. GAHHH. The only upside I see is when it grows up enough to blindly follow what I tell it to do. A little return on the "brotherly love", if you will. Outside of that, I can only hope that I'll never have to deal with its crap, both rhetorically and literally.
     
    As you may have picked up on, I hate children. Always have and always will. I've been told numerous times that if I find the elusive "one", then someday kids. And that I was a kid once. It's different when they're yours, etcetera etcetera. I'm glad to report that all of that is bullshit. A: If she really is the one, then wouldn't it be kinda obvious that she too hates children? Something of this nature isn't exactly negotiable. B: I'm aware of that and have profusely apologized to every relative I talk to about it. I know how annoying I can be. Can't imagine it being any better when I was younger. And C: No, it isn't. I know me. There needn't be more of me on this planet. I'm only 21 and I'm saying "damn kids these days", and that is not something that goes away with age. I don't have the patience for other people's children, and it's supposed to be different when I can "discipline" them? The last thing I'll ever need is a kid crying because all of its needs are met.