Thursday, March 30, 2006

Things you probably did(n't) GNU

A cool summary of how to deal with all things GNU/Linux for the total "newbie" is presented by the ever-informative Dominic. It's a nice summary of how to get things up and working, and what to do when those things don't.

Speaking of things not working, I've once again decided to tackle head-on the problem of running COBOL code on GNU/Linux. Failed. Miserably. I've tried compiling and running TinyCobol and OpenCobol, and both threw errors. I'll try hitting whatever forums/support channels those projects have and hope for the best.
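For the record, this is roughly the kind of smoke test both compilers choked on - a minimal sketch, assuming OpenCobol's cobc compiler is installed and on the PATH (TinyCobol's htcobol takes a broadly similar invocation):

    $ cat > hello.cob <<'EOF'
           IDENTIFICATION DIVISION.
           PROGRAM-ID. HELLO.
           PROCEDURE DIVISION.
               DISPLAY "Hello from COBOL on GNU/Linux".
               STOP RUN.
    EOF
    $ cobc -x -o hello hello.cob   # -x builds a standalone executable
    $ ./hello
    Hello from COBOL on GNU/Linux

Note the leading spaces in the source: cobc defaults to the traditional fixed format, where code starts at column 8, so the indentation isn't cosmetic. On my machine it's the cobc step that falls over.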

Tuesday, March 28, 2006

Tidbits and tidbytes

Actually not much computer-related, to be honest

In a (sort of) response/rebuttal to the whole Danish Muhammad cartoons fiasco, "Dimona Comics" have announced an "Israeli Anti-Semitic Cartoons Contest". One of the judges will be none other than the semi-legendary Art Spiegelman, also known as the creator of Maus, one of comics' seminal masterpieces.

Ronald Moore, producer of the current incarnation of Battlestar Galactica (a.k.a. the gift that keeps on giving), said some stuff about the show, video games and whatever else he had on his mind.

I'm a fan of The Prisoner, no secret there, so here are scans of three "novelisations" and of Jack Kirby's The Prisoner comic.

The above two links are courtesy of the Website at the End of the Universe, which also notes the passing of Science Fiction writer Stanisław Lem.

Finally, some notes on how to get rid, at last, of that mind-numbing, productivity-sapping, over-rated, user-unfriendly computer peripheral, the mouse.

Sunday, March 26, 2006

Oh well...

No, your other left!

I believe in equilibrium.

This is to say that whenever something tilts to one (usually wrong) side, eventually it will tilt back to the other, balancing things out. More often than not, someone releases some research claiming the opposite of what we all know. I'm not referring to Microsoft Get-The-FUD campaigns and their ilk, but to a third party (usually a respected one) that announces that something isn't the way you always thought it to be, only to retract the claim several months later (sadly, after the damage has already been done).

For example, several months ago Symantec published an article claiming that Firefox has twice as many security flaws as Microsoft's Internet Explorer. I claimed then, and still stand by it, that being open source, Firefox is not only not twice as insecure, but can deal with those flaws much faster and more efficiently, while Microsoft either ignores or refuses to recognise flaws (which exist nonetheless) until it releases a fix for them, which can take as long as several months.

That being said, Symantec has lately had a change of heart, now counting "vendor- and non-vendor-confirmed flaws", clearly showing that Firefox is the more secure of the two.

Another recent example comes in this article regarding Nature.com's comparison of the Britannica and everyone's favourite encyclopedic punch bag, Wikipedia. To refresh the memory, Nature.com conducted a review of 50 articles from each publication and found, surprisingly, only about 30% more errors in Wikipedia than in Britannica. A short debate erupted; some claimed the triumph of the "common intelligence" over the old-fashioned academic one, some wondered what counted as an error (for instance, what the hell is Wagnerian Rock?), but the sad writing was apparent on everyone's wall.

Not really.
You see, it appears that Nature.com "cooked" their research. Cooked, I said? More like steamed, stewed, roasted, soaked overnight, mashed, baked, fried, boiled, and brewed, as, according to the Register, "Nature sent only misleading fragments of some Britannica articles to the reviewers, sent extracts of the children's version and Britannica's "book of the year" to others, and in one case, simply stitched together bits from different articles and inserted its own material, passing it off as a single Britannica entry."
And even with all that, Wikipedia still came out with 33% more errors? This doesn't bode well for Wikipedia. But then again, those mashed-up examples that Nature.com used for the Britannica are more-or-less how Wikipedia articles are created.

Saturday, March 25, 2006

Get on with the program

The Register has an article about COBOL. It's mostly "COBOL is still alive!" and "You can still use COBOL" stuff. Nothing really interesting if you actually work with it and keep up to date, but I assume this is shocking or even horrifying news to people who think the world of computers starts and ends with AJAX.

Speaking of ancient languages, here's something really bizarre. This blog has a neat-o post about Doctor Who special effects, including a detailed review of some of the code seen here and there on the TARDIS screen. Turns out that after writing the program for all those "high-tech" graphics seen on the TARDIS' screen, they actually displayed the BASIC and Assembly code itself on said screens. It's quirky, but then again, wasn't the Terminator running on DOS?

As a side note, the world of gaming seems to be going, no, make that spiralling, down, and fast. First came the announcement of a video game based on "Desperate Housewives", and then came this Pokemon penis, er, stylus.

Thursday, March 23, 2006

More of Moore

I've thought about my previous post regarding Alan Moore and the movie version of V for Vendetta. I believe some clarifications are in order.

First, I kept referring to Moore's cinematic writing, which might seem inappropriate, as we tend to think of the art - the drawing - as the cinematic part. This would've been correct, had Moore's writing style been different. An Alan Moore script includes everything we will see: the position of the characters in the panel, the POV, what the characters do, what they look like when they do it; making the artist less of a creator and more of a deployer.

For example, this opening panel from the (unpublished) Nightjar script:

"Two tiers of three frames each with all the frames the same size, then a narrow strip along the bottom with frame seven quite small and frame eight being the title logo. If you've got a better idea then please don't feel intimidated by all this junk - just go ahead and do what you like.

First of the six flash-back frames that form the opening sequence. I'm still quite fond of the idea of maybe using some different medium for these first few panels to give them a different look. If you're doing the rest of the strip using a half-tone maybe you could do these frames in pencil? Just a thought... This first frame shows a view of an overgrown and untended terraced garden, looking towards the peeling back door which is opening towards us showing a rectangle of darkness within. Someone unseen is opening the door from inside - we can see his fingers clasped round the edge of it. The garden is deathly still, maybe just a couple of insects droning somewhere. There's junk everywhere - bricks, pram wheels, tin bath, plant pots - I want to give the impression of a frozen instant, like something out of an old photograph album. As an almost subliminal detail there is a small bird swooping low over the garden, it's shadow falling neatly beneath it. We have caught it at one split instant of it's flight. It'll be gone by next frame."

So yes, we are basically talking about much more than just the writing/narrating/plotting. This style places the writer in the role of scriptwriter, director, and actor, while the artist is the cameraman, set designer and "puppeteer" of the actors. It's a writing style which has become synonymous with Moore, and at times is even referred to by name.


Second, I am well aware that movies are not made for the sole reason of realising a story in another medium. I mean, I'm sure Kubrick's main reason for creating "The Shining" was to reimagine Stephen King's horror story with cinematic tools, but to his producer, and that producer's boss and so on, the main reason was "Successful book by famous writer + famous director = Mucho $$". Movie licenses cost money, and a director/writer/producer can stand in a Hollywood studio and scream till he's blue in the face about the innovative qualities of the original piece and the cinematic breakthrough that the adaptation will be, but the guys with the money only care about whether the original's name will sell tickets, and whether the adaptation will be marketable enough for them to shell out the dough for the license.

This is partly the reason why Alan Moore's magnum opus, Watchmen, still rolls around unproduced. A complex book that has always been more critically than commercially acclaimed, and no real way to make it into a 90-120 minute movie without losing either the comic fans (for being unfaithful to the original), or the movie-goers (for being too complex and heavy), or, more likely, both.
I think this is also the reason why V for Vendetta was made into a movie. I've no idea how successful the comic book was, but the movie equation is golden, with a script written by the Wachowski brothers (The Matrix) and featuring Natalie Portman. Add a dark, gothic atmosphere and the recent success of comic-book movies, and you have a seller. Sadly, not the seller I'm looking for.


Finally, I think my point regarding the cinematic qualities of V for Vendetta being the exact element that would make it a bad movie wasn't explained fully. I'll leave the overall concept, which was detailed in the previous post, alone, and focus on the key elements that "make" the comic book and would break the movie.

V for Vendetta has a very strict format. The story is separated into 3 volumes, each made of 3-4 issues, and each issue split into 3 episodes. There are visual - monochrome, full-page - separators between every episode, issue and volume, which help set the tone, like grave, monolithic bookends. The plot develops accordingly; the internal and external narrative progresses accordingly. This almost begs for a movie trilogy, separated into acts and scenes in correlation with the segmentation of the comics. Any attempt to squeeze it into one 2-hour movie forces serious cuts, damaging the story.

The panel narrative is built from short action scenes. Character enters room, waking woman, CUT to detectives brainstorming at a computer, CUT to character and woman talking, CUT to police officer being informed, CUT to detectives talking, CUT to character saying goodbye to woman, leaving the room and being stopped by a policeman. (I'm leaving out the actual details to limit the spoilers.) The pacing of the panels, their positions, how many panels each scene gets, etc. - all of it was created with the idea that the action would be read by a comic book reader, not viewed on a screen. Adapting this to a movie would demand one of two things: either maintain the pacing of the action and kill the scene, or ignore the original and "reimagine" the action (which would probably kill the scene...)

The comic book format offers some unexpected advantages. It's a silent film, and Moore takes full advantage of that. There is no background music to dictate the tone, for instance, but more than that, there is only as much "sound" as the writer allows us to experience. The above scene ends with the death of one of the characters, and the whole fight between the two is fast and silent (no sound effects like "Pow!" or "Wham!"); the dying character looks at us with what is obviously a terrified shriek, made even more terrifying by the lack of sound. Just think of watching the same scene in the movie with no sound. No music, no voices, nothing. It wouldn't have the same effect. On the contrary.

Superman's origin has been told 4-5 times already (not including the changes forced by the Crisis and Zero Hour); Batman's origin was told 3-4 times as well. The X-Men were "rebooted" about 3 times, not including the "Ultimate" version and similar projects. All DC/Marvel characters have a history full of retold, retconned, reimagined, rebooted and rephrased stories, told over and over again by dozens of different writers and drawn by dozens of different artists. In this view, the movies are just another link in the chain.
V for Vendetta was told once. Start to finish. It was drawn once. There was only one artist drawing it. This isn't "yet another version". So why make it?

There is an ounce of vanity in creating such a movie. A writer/director taking it on themselves to create such a movie doesn't usually think "I will humbly deliver the genius of Alan Moore to the masses", but rather "I love this story, bet I can make a hell of a movie out of it", which roughly translates to "I can do this better". This doesn't fit the Wachowskis, whose breakthrough project was a philosophical mash-up full of narrative holes (which they attempted to plug with CGI effects and martial arts scenes) that failed to hold one movie, not to mention three. Any shred of actual, solid, cinematic quality was thrown to the four corners of the earth with the two bloated sequels, which did nothing, told nothing, went nowhere, and cost gazillions of dollars to do it. There are some good directors/writers who are probably more capable of "doing it better". Terry Gilliam comes to mind. Tim Burton, perhaps. Johnny Depp as V would probably be perfect, and might give enough of his own interpretation to make us forget the original V, one of comic books' greatest anti-heroes. Just for reference, Evey, a 16-year-old girl, is played by the 25-year-old Natalie Portman. There's a scene where Evey is supposed to fool another character into thinking she is less than 15. I just don't see that happening here.

R for Redundant

A lot of hoopla has been thrown around due to the recent release of the V for Vendetta movie, based on Alan Moore's comic of the same name.

I've not seen the movie, so I can't really comment on it (let's just say that fans of Moore, and the writer himself, are Not Pleased), but I think that Alan Moore's works would, in general, make lousy movies, precisely because they are so cinematic.

I know this sounds like arguing for the sake of argument, but there's a very simple logic behind it. Take any Alan Moore comic, and you'll realise what I mean by "cinematic". The quasi-camera movements of the POV (the sidewalk-to-top-floor zoom-out at the start of Watchmen comes to mind as a good example, as well as the train scene from the first V for Vendetta), the way characters move around the panels, etc. It's all very "cinematic", and it makes the panels almost spring out of the pages and come to life, as if you're watching a movie instead of reading a comic. And for that very reason, it will never work as a true movie, live-action or animated.

If it's still not clear: the whole concept of Moore's writing revolves around making the comic itself engrossing, moving, vivid, immersive and, in general, cinematic. This works within the confines of the medium, i.e. a printed comic book. Taking this and presenting it in a movie just doesn't work. If anything, it's redundant. V for Vendetta's plot is spread across 10 issues, dictating the narrative development and the pace of the story. This isn't a Batman movie, based on a comic that is 70 years old with a thousand issues to its name, but a short, concise, self-contained story, created with the format in mind, and for the format. Transposing it to a 2-hour movie would mean crippling it, removing all that is good about it, and hanging it out to dry. It might be a great movie, but it will be a great movie despite being a poor representation of the original material (not that that's necessarily a bad thing; Kubrick's The Shining basically butchered the Stephen King book, but was a masterpiece nonetheless).

There have also been rumours that the Watchmen movie license is rolling around in Hollywood, looking for someone to take it and make a movie out of it. Here's to hoping it never finds one.

Freedom fries

A small ripple in the pond was made by the French lower house's decision to ban DRM lock-in, forcing every music-playing gadget to be allowed to play every digital music format, and vice versa. Apple Computer, probably the biggest company to "suffer" from this legislation, was quick to respond.

What argument did they respond with? FUD.

Just to push the point here: Apple has no problem with other formats being played on their iPods (which support mp3 and other non-DRM formats); the problem is that their proprietary 'FairPlay' (snigger) files would become playable on other machines. How does that amount to "state-sponsored piracy", you ask? Simple: once the file format is open, you would be able to take the music file you bought from Apple and play it on any other machine, thus encouraging you to buy more music files from Apple... No, wait...

Let's try again. Today, if you want to play your iTunes files on anything but an iPod, you need to 'crack' them with some third-party (illegal) software, which is a bit of a hassle and tussle, making it much easier to just illegally download the file in the first place instead of buying it from... Rats.

I can go on forever. But it's not the point. Here is what Apple is really saying:

Allowing iTunes files to be played on other machines would mean people don't have to buy iPods to play music they downloaded from Apple.

That's it. All the "state-sponsored piracy" bullcrap is just that.

Shogi the money

I found this image at Kotaku. It links to some Nintendo seminar I know nothing about, but the image is très cool.

I'm not much of a Shogi fan, despite my interest in Chinese Chess. The main reason is that while Chinese Chess (Xiang-Qi) contains thematic elements similar to Western (FIDE) Chess, Shogi is a distant cousin, with too few similarities for my taste. I've included a couple of Shogi pieces (or variants of them) in one of the chess variants I created, but simply for those pieces' moves, rather than their function in Shogi.

Wednesday, March 22, 2006

Coincidence? Or is it...

About two days ago came the official announcement that Ubuntu's next version, 6.04 ("Dapper Drake"), will be delayed until June 1 (and will be renamed "6.06" as well).

A day later, Microsoft came out of the blue and announced that Windows "Vista" will only be released on January 1st, and not in late 2006 as they previously hinted.

Hmm...

Isaac Haze?

I had the laugh of my life reading about Hayes' leaving South Park over its misrepresentation of Scientology, but it seems we haven't heard the last of it, as according to FoxNews, Hayes did not quit the show. On the other hand, it seems that South Park's creators want to get even with Hayes for leaving the show by doing an episode that will parody Chef, Hayes' South Park character.

Have my ears gone insane? So which is it?
Stay Tuned! (always wanted to say that)

Monday, March 20, 2006

Is that a slash in your dot?

Or are you just happy to... Err... Let's not go there, shall we?

Some recent Slashdotting.

Most "main-stream" GNU/Linux distros nowadays go with the "pre-compiled packages" solution to software installation. This goes against the compile from source concept which is, for many GNU/Linux purists, the whole heart and soul of the whole GNU/Linux concept (or the bread-and-butter of it, pick your choice), as the actual compiling method allows the user to configure and customise the software to his needs, preferences and optimal system compatibility. This article tests both options.

One of the Mars rovers, Spirit, has lost one of its 6 wheels. Not bad for something that wasn't supposed to last more than 3 months and is now well into its second year, or, to quote NASA, "two years into its 90-day mission". Much like the Star Trek franchise, which, almost 40 years into its five-year mission, has lost all its brakes.

Speaking of Open Source, the Economist runs a (somewhat clueless, it seems) article regarding open source projects, citing MySQL, Firefox and, well, Wikipedia as examples of a method of which it says: "Its advantage is that anyone can contribute; the drawback is that sometimes just about anyone does." Which serves as a lesson to kids everywhere that writing under the influence of drugs is not a good habit.
Hehe. Sorry.
I think the biggest confusion here comes from bundling open-source software projects with the "communally edited" Wikipedia. Open-source projects are not chaotic, anarchic, or "contributed by anyone". These projects have a maintaining body, which has the final word on what goes in and what doesn't. Contributions are welcomed, but do not immediately become part of the product, even when the nature of the contribution is a free-for-all one. For example, while anyone can create a Firefox extension, extensions are not an integral part of the downloaded software (i.e. "the product"), but are offered separately on a "use at your own risk" basis. Other products, like the Debian distro, go even further and restrict package submission to authorised maintainers. This is why the Wikipedia concept is not a good example of Open Source or, to quote the NYTimes, "Anonymous Source Is Not the Same as Open Source." The "everyone can do anything" method is just not the same as open-source development, not by far.

And, a double-duo couple of two Microsoft issues:
An analysis of .NET usage in Vista shows that "Vista has no services implemented in .NET". Always nice to see a company backing up its own technology. Almost makes one wonder what the Mono guys are doing supporting this framework while its creators prefer running native code to utilising the .NET framework. Once again, it seems developers and companies are falling for Microsoft PR rather than the simple reality.
Which speaks volumes for the next article, claiming "Windows Vista's tough approach to spyware may put anti-spyware companies out of business". But seriously, folks. I'll believe it when I see it. Marketing your yet-to-be-released product as "100% spyware proof", to the point where it will supposedly drive anti-spyware companies out of business, just doesn't cut it in my book. Remember Bill Gates' announcement in 2004 that "Two years from now, spam will be solved"? Remember Gates claiming, this year, that Microsoft has, true to its word, eliminated spam? I just hope their concept of "eliminating spyware" isn't as fuzzy as their concept of spam removal.

Friday, March 17, 2006

Links from the Planet

Ubuntu Planet that is.

Mark Shuttleworth apparently made his money from self-digestion, and not by killing every other GNU/Linux distro or eating babies.

Solve a bug, get a hug? Wonders will never cease.

Just when you thought you got the latest buzzword sorted out, comes a new definition of LAMP.

And on the other side, there seems to be some rumbling in the community regarding LaunchPad, Canonical's bug-tracking, translation and project co-ordination service. This guy adds some oil to the fire.

Tuesday, March 14, 2006

Isaac Hayes: "Isaac Hayes has a problem"

Just read.

Monday, March 13, 2006

The root of all security

Anyone who reads the tech news regularly has probably heard the latest announcement: a flaw in the Ubuntu 5.10 installation process leaves the default user's password saved as plain text on the hard drive. As Ubuntu doesn't (by default) allow a root password to be created, and instead gives the first user root abilities by way of 'sudo', this means the de-facto root password is left in plain view as plain text (pun not intended, at least not originally...), allowing other users access to the First User's password.

However...
No, scratch that, I'll however in the next paragraph.
What's most surprising here isn't that Ubuntu has a security flaw; software isn't perfect, and since every new version of Ubuntu has rewritten the installer, installation flaws might appear. Actually, what's surprising here is that it took so long to locate it. A lot of the open source "evangelists" claim that these kinds of bugs tend to surface more easily due to the large number of users/developers, the access to the code, the better methods of communication, the community, etc. That didn't happen here. Until now.

What did happen is that this was patched in hours. What did happen is that this doesn't give outside users root access, only local users (and users connected remotely through SSH). It also emphasised the fact that good security practices are still the best way to ensure that your system is safe from outside attacks. Meaning: users who installed the OS through the "expert" mode and created a root password were never in any danger, and users who, following the installation, enabled root or a root-like user (meaning creating a second user that has the sudo-root privileges, and making the First User a limited, non-sudo user) wouldn't have been compromised by this flaw.

Also interesting is the question regarding the whole sudo model. Ubuntu's decision to use sudo instead of root has brought many complaints from veteran GNU/Linux users. Many people feel that this practice compromises the system's inherent security model, and is a very good example of how Ubuntu, in its attempt to be more "accessible", broke that model.

I don't subscribe to this notion. The dangers of working in a root terminal are well known, as the user might not close the terminal, or log out from root, after completing the operation. The sudo model minimizes the risk of a forgotten, open superuser terminal by forcing you to enter a password for each root operation. Adding a second layer of distance between the default user and root operations, by creating an "admin" user with sudo privileges, is even better than the normal user/root model, as logging into "admin" would still demand the sudo password, and forgetting to close the "admin" terminal won't compromise the system.
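For the curious, here is a minimal sketch of that two-layer setup, assuming Ubuntu's default sudoers file grants sudo rights through the 'admin' group (as a stock Breezy install does); 'firstuser' stands in for whatever your installation-time account is called:

    $ sudo adduser admin            # create the dedicated admin account
    $ sudo adduser admin admin      # put it in the 'admin' group, granting it sudo rights
    $ su - admin -c 'sudo whoami'   # sanity check: should print 'root'
    $ sudo deluser firstuser admin  # only now demote the first user to a limited account

The order matters: verify the new account can actually sudo before stripping your own privileges, or you'll lock yourself out of root entirely.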

On a similar note, I wrote in the past about the faulty concept that a product's security is measured by the number of officially disclosed flaws. I'm happy to see that there seem to be some changes in this way of thinking.

Saturday, March 11, 2006

Dapper Drake Delayed? Decisive Debate Devolves.

Looks like I might not get a birthday present from Canonical next month, as Mark Shuttleworth, company owner and Ubuntu Benevolent Dictator for Life, said that he wants the distro to be delayed by six weeks for "additional validation, certification, localisation, and polish".

My opinion? Great news. The current version of the distro does everything I need and more, so the next release is not something I'm burning for. I don't mind holding on for an early June release. I mentioned before how profound the improvements in the current version (5.10, a.k.a. "Breezy Badger") were over the 5.04 version ("Hoary Hedgehog"), and this alone makes me very, very excited about the next release (6.04, "Dapper Drake", although I guess it'll be called 6.06 now...).

Actually, the biggest concern is that Ubuntu and Canonical have tried to differentiate themselves from other non-commercial GNU/Linux distributions by offering a quasi-commercial development cycle, i.e. a new release every 6 months. This delay might be the first crack in the wall of excellence Ubuntu has been building around itself. Though, to be honest, I doubt it will do any actual damage.

I've previously posted my concern that the developers seem to be concentrating more on bells, whistles and gongs than on refining the software, but it seems their hearts and minds are in the right place. For me, it's a Good Thing. I'll wait the extra weeks for something that would make my (already excellent) Ubuntu experience better.

Friday, March 10, 2006

Reclaim your office

I've been bashing OpenOffice.org in some of my posts, so I guess I should set things straight.

I've been using the office suite for several years now, and exclusively for more than 2 years. I do all my writing in it, use it for spreadsheets and presentations, the whole bunch. In fact, my current Windows partition doesn't even have MS Office. It's not as fast as I would wish, but that's a long way from making it a bad application, which it isn't. It is fully usable, and includes everything I need for my daily work, whether at home or at the office.

On that note, OpenOffice.org (when will they decide to drop the oh-so-1998 '.org' from their name?) has just gone to version 2.0.2.

New Technology Journalism

Found this at the Inquirer. It seems Nvidia's second rule for editors is very simply "No leaks to the Inquirer". Too late for that, eh?

Thursday, March 09, 2006

I fart in your art!

Just a quick one: Evil Avatar pointed to an article in 4 Colour Rebellion, which pointed to an article in Joystiq, about Shigeru Miyamoto being knighted in France along with 2 other game designers.

Of course, Joystiq couldn't leave well enough alone, and I quote: "With several OBEs awarded in Britain to game developers, it seems that games are becoming recognised as art - enough to deserve national honours, at least".

And there you have it! By the same logic, army generals being knighted means that war is art, football managers being knighted means that football is recognised as art, and scientists being knighted means that science is art.
Or in other words, I would love to know how the cunning minds at Joystiq reached that conclusion. Key industry figures have been knighted in many fields; it mostly means they have contributed to their field, or industry, in a way that is either profitable or honourable (or both) for the country, or in a way that is extremely remarkable (or just good PR...). Knighting has nothing to do with being recognised as art, sorry.

It's clobberin' time! Oi vey...

A fun post at the Website at the End of the Universe shows that the Fantastic Four's Thing is Jewish (proof here). It also links to a Wikipedia list of Jewish, or Jewish-born, superheroes and comic book characters. As always with Wikipedia, there just HAS to be the odd item that reminds us all how unreliable Wikipedia is. In this case, one Phantom Stranger, tagged as "possibly Jewish", which is basically like saying the Christian god (all three of him) is "possibly Jewish".

The Phantom Stranger's true origin was never fully disclosed, and he is one of the more mysterious characters in the DC universe. He's supposedly an immortal, spiritual, mystical and arcane character, with god-like, yet never fully explained, powers and nature. In Secret Origins, several writers presented their versions of the Stranger's origin. One of them claimed him to be an angel who, after Lucifer's rebellion, was cast out of heaven but forbidden entry to hell, left to wander the earth eternally. Another had him as the Wandering Jew, and so on. Extrapolating from that that the Stranger is Jewish... I don't think so.

In Pratchett's Discworld, magical characters, such as wizards and witches, are said not to believe in gods, in the same way you don't believe in doors. They're there, and they're useful, but just not something you believe in. Attributing religion to the Phantom Stranger, in my opinion, follows a similar route, as he is in essence a demigod, and is "beyond religion". Are the angels religious? That's an obvious one, but was Moses religious?
I don't think so. There's a prerequisite for being called "religious", which is blind faith in your deity. I find it hard to believe that someone who's been talking to god on a daily basis, even arguing with and openly defying said god, can be said to be religious. In comic book terms, characters like the Phantom Stranger, or the Spectre (the embodiment of the Angel of Vengeance), cannot be said to be religious.

Wonder Woman, on the other hand, is an interesting character in this respect. In her current format, she was created from sand (clay) by the Greek gods, she walks with them, and was even deified at a certain point. In War of the Gods, she fought with them and against them. Despite all that, she is constantly portrayed as devoutly religious, often invoking, or praying to, them. Is this a case of a person simply needing something to believe in (as many religious figures would have you believe (no pun intended), belief is a natural instinct, like love or hunger), or just a plea from one on a lower level to a higher one? Most often, Wonder Woman invokes Gaia, who is two "generations" above the Greek or Roman gods, so that might be the case. Or it might be that the act of praying is what the gods accept as a form of calling or summoning. So when a god meets someone in a pub, he doesn't leave his phone number, but praying instructions. That would be it.
