Articles tagged 'pc'

Mac Daddy Sep 24 2006

So I’ve recently got a MacBook Pro, and it’s fucking awesome. For a start, it’s quick, and secondly, I’m finally, after numerous attempts in the past, enjoying the Mac OS X experience. I’m not sure what it is this time around, but I think working on a Mac just feels right in laptop form. I’m running Parallels for all my Windows needs (damn you, VS2005 and C#), and while I’m tempted to also give Boot Camp a go for the gaming aspects (have you seen Company of Heroes? and apparently HL2 runs real nice on the MBP too!), for now I’m sticking with the virtualisation solution.

But I’m also learning TextMate, and man oh man is that a sweet program to use. I’m a little overwhelmed by all the add-ins (sorry, bundles) and shortcuts there are, but at least it’s prettier than emacs :-)


Hi-Def: highly, and definitely, confusing Jun 17 2006

So I’ve been toying with this post in my head for a few days. I recently bought a Samsung HD Ready 32" LCD TV, and a week or two ago I decided to get my Mac Mini hooked up to it, using Front Row as a super-duper media center. I soon realised that while the TV touts an “HDMI/DVI” connector, it really just means HDMI. I’m well aware that, essentially, HDMI and DVI carry the same signal with different connectors. Why we need two connectors I’m not sure, but certainly DVI-D (a completely digital signal) and HDMI are one and the same, so a DVI output from a PC into an HDMI connector will result in a totally digital signal from start to finish.

Now, after my initial disappointment that there wasn’t in fact a DVI connector on the back of the TV, I stuck with VGA (it has a PC input too) until I’d researched cables and connectors further. I assumed, therefore, that running over VGA, HD output from the Mac Mini to the TV wasn’t possible. I skipped over the fact that the TV picked up on the PC signal, with the Mac switching resolution to the TV’s native 1366x768 (720p). It didn’t dawn on me that HD could still be possible, as I assumed it needed a digital, not an analogue, connection.

So onwards and upwards… after laughing so hard I thought I was going to break my spleen at the idea of paying £60 (just over $100 for those across the pond) for a gold-connector DVI->HDMI cable, I was ready to give up on the idea, till a trip to eBay came up trumps: £8 (around $14) for a gold-connector 2m cable for running from a DVI source to an HDMI connector on a TV. However, upon plugging it in and hooking it up to the Mac, it became apparent that the TV and the Mac had decided to stop communicating, instead offering me only some default resolutions (1280x768, 720p, and 1924x1344 or something, equivalent to 1080i). I’ve read that the Intel Mac Minis now allow slightly non-default resolutions in order to cater for the native resolutions of TVs, but I’m out of luck with my PPC Mini (I’ve read about various display tools that achieve the same thing, but am well aware that the wrong settings, especially where refresh rates are concerned, can break monitors and TVs).

Long story short, the HD space is crowded with confusion. I’m still not 100% on my thesis that HD will work over either VGA or DVI, because it’s simply a matter of resolution: whatever is supported by both the TV and the output source will determine whether HD is possible. I’m currently still using the DVI->HDMI cable, running at 1280x768 (720p), though I’m not sure whether VGA might provide better quality. And of course, when my Sky HD installation eventually happens (now delayed a further 11 days because of a shortage of boxes), it’ll require the HDMI input, so I’ll be out of luck unless I buy a switch.

I intended to keep this post shorter than it has turned out, but that pretty much sums up my experience of getting HD-capable computers and TVs to work together. After all of this, I discovered while playing the H.264 HD QuickTime test videos available from Apple’s website that my PPC Mac Mini isn’t really powerful enough to play HD content anyway. Sometimes it just about struggles through; often it chugs and stops and starts, making the video, while incredibly detailed, totally unwatchable. I’m now thinking of either a newer Mac Mini, or perhaps even a Windows-based media center - more on this later.

It should be said though that Front Row is awesome, and with the Mac also hooked up to my home theatre, I have a simple, elegant way to access and play all of my music collection, as well as the movies, TV shows etc I have stored. It looks truly beautiful on a 32" widescreen TV too.


Ooh, embarrassing Apr 10 2006

How’s that for cross-browser compliance? I just realised (thanks to my bro) that when viewing this very blog in Internet Explorer, it was asking users to download the page as a file. This will be down to the work I was doing on accept types and content types a while back, and it must have snuck by my rigorous multi-browser regression testing system ;-)

Anyway, one quick fix later and even IE users can see the content now :-)
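
The general shape of that kind of fix, purely as an illustration (this isn’t the actual code behind this blog), is to check the request’s Accept header and only serve application/xhtml+xml to browsers that claim to accept it, falling back to text/html for everyone else, since IE offers to download anything served with a MIME type it doesn’t understand:

    // Illustrative sketch, not the blog's actual fix.
    public static class ContentNegotiation
    {
        // Only serve application/xhtml+xml to browsers whose Accept header
        // includes it (IE6 doesn't); otherwise fall back to plain text/html.
        public static string NegotiateContentType(string acceptHeader)
        {
            if (acceptHeader != null && acceptHeader.Contains("application/xhtml+xml"))
                return "application/xhtml+xml";
            return "text/html";
        }
    }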


Building blocks Apr 10 2006

So before I get cracking with a few ideas I’ve got going on, I wanted to get my base framework and build environment sorted. After taking xFramework offline and turning all of the useful code I wrote into a more condensed, useful set of libraries, I then decided to work on a few build tools.

First up was to refine the unit-testing libs and console tool that were part of xFramework. I renamed and rebuilt them, making them leaner and faster. Previously there was one core library, referenced both by the console tool to execute tests and by the client library containing the unit tests to tag its unit-test classes. Now it’s two separate libraries: a client framework lib, referenced in order to mark up unit-test classes with attributes, and a console tool with its own core library (plus a reference to the framework lib) to execute the unit tests themselves.
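
To illustrate the split (the attribute and class names below are made up for illustration, not the framework’s actual API): the framework lib defines the attributes a client assembly references to tag its tests, and the console tool’s core library discovers and runs anything so tagged via reflection.

    using System;
    using System.Reflection;

    // Framework lib side: attributes the client assembly uses to tag its tests.
    // (Hypothetical names.)
    [AttributeUsage(AttributeTargets.Class)]
    public class TestFixtureAttribute : Attribute { }

    [AttributeUsage(AttributeTargets.Method)]
    public class TestAttribute : Attribute { }

    // Core library / console tool side: load the client assembly, find the
    // tagged classes and methods, and execute them.
    public static class TestRunner
    {
        public static int Run(Assembly clientAssembly)
        {
            int failures = 0;
            foreach (Type type in clientAssembly.GetTypes())
            {
                if (!type.IsDefined(typeof(TestFixtureAttribute), false)) continue;
                object fixture = Activator.CreateInstance(type);
                foreach (MethodInfo method in type.GetMethods())
                {
                    if (!method.IsDefined(typeof(TestAttribute), false)) continue;
                    try { method.Invoke(fixture, null); }
                    catch (Exception ex)
                    {
                        failures++;
                        Console.WriteLine(type.Name + "." + method.Name + " failed: " + ex.InnerException);
                    }
                }
            }
            return failures;
        }
    }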

Secondly, I wanted a build server. I searched a while back and came up short. There are plenty out there, but the criteria I’m after are:

  • Automated build setup, with an easy-to-use configuration structure
  • Plugin interface for a totally extensible build and reporting process
  • Cross-platform, runs on both Linux and Windows, and preferably Mac OS X too
Those are just the basics of what I wanted, but it’s a start. Anyway, I decided a while back I’d write my own, and got a fair way through it before more important stuff took precedence. This time, I decided to stick at it and get it done. I started from scratch, only referring to the code I wrote previously for a couple of things, more as a point of comparison. I now have a fully extensible build system (a rough sketch of the plugin idea follows the list below), which so far has the ability to:
  • Check out a working copy from a Subversion repository, using either the “file” or the “svn+ssh” protocol. I’ll write further SCM support as and when I need it.
  • Build the project; so far the only supported plugin is an MSBuild one, for building MSBuild project files such as Visual Studio 2005 solutions and projects (the API should be fairly similar to the XBuild tool for Mono, which is a port of MSBuild, and I’m thinking about a NAnt provider somewhere down the line)
  • Run some code metrics; so far just a simple line counter for each file meeting the configured criteria, plus a total line count
  • Run unit tests on the code, using my unit-testing framework and a bridging plugin for the build system
  • Generate some documentation, using my own documentation engine (see below)
  • Clean up after itself
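
Each of those steps is just a plugin behind a common build-task interface. Purely as a sketch of the shape (these names are mine, not the real API), a task plugin boils down to something like this, with the line-counter metric as an example implementation:

    using System.Collections.Generic;
    using System.IO;

    // Illustrative names only - not the actual plugin API.
    public class BuildContext
    {
        public string WorkingDirectory;                      // checkout location for this build
        public List<string> Messages = new List<string>();   // feeds the build report
    }

    public interface IBuildTask
    {
        string Name { get; }
        // Return false to fail the build and stop subsequent tasks.
        bool Execute(BuildContext context);
    }

    // Example task: the simple line-count metric mentioned above.
    public class LineCountTask : IBuildTask
    {
        public string Name { get { return "Line count"; } }

        public bool Execute(BuildContext context)
        {
            int total = 0;
            foreach (string file in Directory.GetFiles(
                context.WorkingDirectory, "*.cs", SearchOption.AllDirectories))
            {
                total += File.ReadAllLines(file).Length;
            }
            context.Messages.Add("Total lines: " + total);
            return true;
        }
    }
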
All of this runs under a series of plugin schedulers, also extensible to provide further scheduling options; currently there are hourly and daily schedulers, as well as a run-once scheduler (runs immediately, once only) and a continuous scheduler (building over and over) for testing. Eventually I plan to write a scheduler that polls the source code system for updates, thereby turning it into a continuous integration build system too.
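
Again just to show the shape (not the actual interface), a scheduler plugin can be as small as:

    using System;
    using System.Threading;

    // Illustrative only - the real scheduler interface will differ.
    public interface IScheduler
    {
        // Blocks until the next build should start; returns false to stop scheduling.
        bool WaitForNextBuild();
    }

    // Runs the build immediately, once only.
    public class RunOnceScheduler : IScheduler
    {
        private bool hasRun;

        public bool WaitForNextBuild()
        {
            if (hasRun) return false;
            hasRun = true;
            return true;
        }
    }

    // Triggers a build at a fixed interval, e.g. hourly or daily.
    public class IntervalScheduler : IScheduler
    {
        private readonly TimeSpan interval;

        public IntervalScheduler(TimeSpan interval)
        {
            this.interval = interval;
        }

        public bool WaitForNextBuild()
        {
            Thread.Sleep(interval);
            return true;
        }
    }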

Now that this is done and tested, there are a few other plugins I want to write for it, but on the whole it’s good enough for me to develop with now. The build reports are also extensible: an interface is defined so that plugins can be written to deal with the build report, and so far I’ve written one provider that saves the report to a file and one that emails it to me, both of which I use. This means I now have my project building nightly, with everything automated and me being notified of the results.
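
The report side reduces to a similarly small interface; something along these lines (illustrative names again), with one implementation writing to disk and one sending mail via .NET 2.0’s System.Net.Mail:

    using System.IO;
    using System.Net.Mail;

    // Illustrative only - not the actual report API.
    public interface IBuildReportPublisher
    {
        void Publish(string projectName, string reportText);
    }

    // Saves the build report to a file in the given directory.
    public class FileReportPublisher : IBuildReportPublisher
    {
        private readonly string directory;
        public FileReportPublisher(string directory) { this.directory = directory; }

        public void Publish(string projectName, string reportText)
        {
            File.WriteAllText(Path.Combine(directory, projectName + "-report.txt"), reportText);
        }
    }

    // Emails the build report via a given SMTP host.
    public class MailReportPublisher : IBuildReportPublisher
    {
        private readonly string from, to, smtpHost;
        public MailReportPublisher(string from, string to, string smtpHost)
        {
            this.from = from; this.to = to; this.smtpHost = smtpHost;
        }

        public void Publish(string projectName, string reportText)
        {
            MailMessage message = new MailMessage(from, to, "Build report: " + projectName, reportText);
            new SmtpClient(smtpHost).Send(message);
        }
    }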

The third and final tool I’ve written is a documentation engine. This came about purely because I tried writing an NDoc documentation plugin for my build system, to no avail (it didn’t like my .NET 2.0 assemblies). I browsed around and found two things: 1) there is little or no activity currently occurring on the NDoc project, and 2) a few users are starting to modify the source themselves to provide a source distribution of NDoc with support for .NET 2.0. I took one look at the XML documentation files that the C# compiler outputs (the ones NDoc uses) and decided that, instead of relying on unsupported and possibly unreliable user hacks to meet my documentation needs, I’d write my own little engine and get exactly what I needed.

And so my third development tool, a core library defining a plugin interface for document generation, was born. It allows project files to be configured (defining the XML documentation input files to use), and it allows plugin documentation generators to handle the actual output. I’ve written one simple plugin to provide HTML output for now; more advanced output could be Linux-style man pages, a Windows-style help system, MSDN-style docs, or a multi-page HTML documentation web site (currently it chews out one HTML file, using JavaScript to allow expanding/collapsing of elements). I have written a console tool to build documentation from the project file, and of course the build server plugin generates documentation directly from within the automated build process.
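
To give a flavour of why this felt tractable (a minimal sketch, not the engine’s real code, and without the JavaScript expand/collapse that the actual HTML output uses): the compiler’s XML file is just a flat list of <member> elements, so a bare-bones HTML generator only takes a few lines.

    using System.IO;
    using System.Text;
    using System.Xml;

    // Minimal sketch: read the C# compiler's XML documentation output and
    // chew out a single HTML page. No HTML-escaping or cross-referencing here.
    public static class XmlDocToHtml
    {
        public static void Generate(string xmlDocPath, string htmlOutputPath)
        {
            XmlDocument doc = new XmlDocument();
            doc.Load(xmlDocPath);

            StringBuilder html = new StringBuilder();
            html.Append("<html><body><h1>")
                .Append(doc.SelectSingleNode("/doc/assembly/name").InnerText)
                .Append("</h1>\n");

            // Each documented type or member appears as <member name="T:...">,
            // <member name="M:..."> and so on, with <summary> etc. inside.
            foreach (XmlNode member in doc.SelectNodes("/doc/members/member"))
            {
                string name = member.Attributes["name"].Value;
                XmlNode summary = member.SelectSingleNode("summary");
                html.Append("<h2>").Append(name).Append("</h2>\n");
                if (summary != null)
                    html.Append("<p>").Append(summary.InnerText.Trim()).Append("</p>\n");
            }

            html.Append("</body></html>");
            File.WriteAllText(htmlOutputPath, html.ToString());
        }
    }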

I now have an end-to-end build process which, while it still needs some tweaking and a bit more code, does the job more than adequately for me. I feel far more confident going on to build on a lot of my ideas, knowing I have the building blocks in place to write and develop applications more reliably.


Sinful Apr 6 2006

I see that SiN Episode 1: Emergence has a release date now, and you can pre-order it using Steam and start pre-loading (as with Half-Life 2). This means that as soon as it’s released you can just unlock it and play!

I pre-ordered my copy just now; you save a couple of bucks by doing so, and you get the original SiN and SiN multiplayer to boot! I’m looking forward to giving the original a run-out again. I haven’t played many games recently, and I’m thinking a straightforward shooter might be just what I’m looking for :-) Plus, the new SiN looks amazing on the Source engine - check out some of these shots for proof!


Gotta love April Fools Apr 1 2006

So another April Fools’ Day, another Google joke. I still think the best April Fools’ prank Google could pull would be to actually pick one of these bogus ideas they come up with every year and try to make it stick. The lunar base, the MentalPlex, or Google Romance - come April 2nd, I want to see it still proudly displayed as part of the Google portfolio. I mean, after all, it’s Google we’re talking about; of course it’d work.

But my favourite April Fools’ joke I’ve seen today is GameSpot’s excellent and totally serious feature on the best Final Fantasy games - the top ten Final Fantasy games of all time, to be exact. Including some games that have nothing to do with Final Fantasy, as well as counting Final Fantasy VIII twice because it’s so awesome, it’s definitely a definitive article for the FF fan. My vote for the best FF game of all time? RF Online. It’s more Final Fantasy than FFXI ever could be :-)
