Monday, December 02, 2013

Deus Ex gametime

Geez...this was actually going OK for me, except that when I read the walkthrough I discovered I had missed all sorts of stuff. I was enjoying the largely stealth-based gameplay.

And then I got to "the jumping game".

I don't do jumping games. They don't work for me--I use a wireless mouse and keyboard, which aren't properly responsive, and my fingers aren't either.

So we're done with that one.

Thursday, November 28, 2013

Dragon Age Origins

So this is a Bioware game, and it sucks like all the other Bioware games I've played, and this will be the last one...

The visuals are good, the 3D well done, the building/terrain models good, but as always, it's the gameplay that bites almost totally. The game has to be played the way the developers want it played--which is pretty much NOT the way I want to play things.

Sometimes you're solo, sometimes you have a squad. Inevitably, your squadmates will all get killed by opponents because you can't actually manage them quite right, and then you likely will get killed too.

The game is WAY too heavy on their chatty cathies and their cut-scenes (look! we made another mini-movie).

Camera control isn't what I want it to be. Can't much look "up". Can't quite "stealth" enough. Way too heavy on left-hand-keyboard/right-hand-mouse -- which I can't do: I was having RSI trouble on the right hand years ago, so I switched to lefty-mouse, and now my keyboard is different too. Not going back.

Excruciatingly linear, on micro maps.

I did not get very far into it.

Where's the delete button?

Friday, November 22, 2013

Steam and their various games

Steam is a fabulous service, the market leader.

But the products are really iffy. I have a number of games that simply do not play on my PC.

Max Payne 1
Max Payne 2
Serious Sam 1 HD

Others are wonky in one way or another, like they sort of work, but have some serious problems and halt for some reason.

Alice 2 (I reach a point where I have to use a custom item, and it simply doesn't do anything)
Batman Arkham City required a mouse that has a different kind of "middle button" than mine.
Supreme Commander 2 has some problem (I forget what was wrong here, but it wouldn't do something).

Do they not do any kind of testing? Or impose some minimum testing requirement on game creators to at least attempt compatibility?

Some work just fabulously:

Skyrim
HL 2
Torchlight 1/2

Given the quantity of total or partial failures I've encountered, I only buy games there when they are low-priced, under $20.

And why do so many games insist on installing yet another version of Visual C++ Runtime? or some variant of DirectX? I'm always a little nervous about this.

Wish there was a way to get the broken ones fixed. Or send them an email saying "BROKEN!"

Batman Arkham City

played some of the PC version of this...visually quite good, possibly the best *looking* game I've played. But it's obviously a console port. The controls feel exactly like what PC Gamer magazine always complained about with console ports--not really built for mouse and keyboard, and checkpoint saves.

Much of it is about the keystroke combination sequences that get Batman to do the choreographed motion-animations, which they probably rotoscoped and re-animated. If you can't quite do the special things, it's just a button-masher, which is ultimately kinda boring.

Seems like every notable opponent Batman had is in this, all kinda flat, really depending on you already being very familiar with them. I'm not, of course.

And I have hit a wall. I have to fight the very first opponent who has body armor. It doesn't matter how many times I apply normal hits, it requires me to do a special move THAT MY MOUSE CANNOT DO.

So I'm done playing this one. Which is too bad, because I don't think I really got all that far along.

Moving on to Dragon Age Origins.

Some more notes on Parallel FS

OrangeFS looks kinda like what I want, but it has the usual Linux-only aspect.

http://www.orangefs.org/

The lack of real portability bothers me. I do most of my Java dev work on OSX, second-most on Windows, with Linux a distant third--mostly because I care about portability. (I do note that that is really the reverse-order list of "able to patch the OS" behavior.)

So of course, in terms of what I can do for myself, it has to be completely portable across OSes, and not require any sort of "kernel patch", because there's no way I'm going to do that--just not interested.

Orange FS does have similar aspects, but at least in the doc reading I've done, they don't quite have a conceptual theme/analogy.

------

Related, but a little weird: I'm actually thinking about writing some/all of this in Lisp (ok, that'd have some portability issues, but there's no reason a Librarian couldn't be written in Lisp). I haven't done any work in Lisp in years, so that'd be kinda cool...and there are some free Lisp versions that are pretty good these days (I recently re-discovered CLisp for Windows, it's a Cygwin package). Already I can see an issue: I need a build that includes multi-threading, and the basic CLisp does not.

------

Desiderata for my Grid FS:

OS-agnostic
Can make use of any/all machines on the local net (scales)
Doesn't require mount-points for every machine (doesn't scale)
All shared space is available to any machine (scales)
Fault-tolerant about machines coming/going; system contains much self-discovery (scales)
Local apps don't really have to know much about the actual system, they're just going to interact with local files (scales)
Has some amount of redundancy
Doesn't require a bunch of special services
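The desiderata above suggest a pretty small service surface. A toy in-memory sketch of the Librarian's bookkeeping (all names hypothetical, nothing here is the real implementation--real copy/transfer/age-off would hang off of this):

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Toy Librarian: tracks which hosts share which files. Machines can
// come and go (fault tolerance), and redundancy falls out of a file
// having more than one publishing host.
public class Librarian {
    // fileKey -> hosts currently sharing it
    private final Map<String, Set<String>> catalog = new HashMap<>();

    public void publish(String host, String fileKey) {
        catalog.computeIfAbsent(fileKey, k -> new HashSet<>()).add(host);
    }

    // A machine leaving just drops its entries; no mount points to clean up.
    public void hostLeft(String host) {
        catalog.values().forEach(hosts -> hosts.remove(host));
        catalog.values().removeIf(Set::isEmpty);
    }

    public Set<String> whoHas(String fileKey) {
        return catalog.getOrDefault(fileKey, Set.of());
    }

    public Set<String> listShared() {
        return catalog.keySet();
    }
}
```

Local apps would only ever see the files they checked out; only the agents talk to this.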

Wednesday, August 14, 2013

Amateur software is the bane of my life


an example: I have to parse some CSVs from another tool I didn't write. The folks who wrote it didn't use a standard CSV output generator, and thus failed to generate correct output a disturbing fraction of the time--specifically, un-escaped embedded quoting characters. And this is in VERSION 2.0 of their software--seems like they never really examined their own output, and that no one else did either, meaning we have already parsed/ingested an unknown amount of data of now-questionable validity.

I basically told them "you have to use a standard tool for this--one that YOU didn't write" so as to avoid this kind of thing, one that follows the conventions that pass for a standard for CSVs (there is no formal standard beyond the informational RFC 4180, but the syntax is well known).

I recommended three possible choices: Apache Commons CSV (not my fave, it's a little cumbersome), JavaCSV (my fave), and OpenCSV (fewer control options). I think they're going with Apache; I am using JavaCSV.
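For the record, the rule the broken tool skipped is tiny. A minimal sketch of RFC-4180-style field escaping (a hypothetical helper for illustration, not any of the libraries named above):

```java
// Minimal CSV field escaping: the step the broken generator skipped.
public class CsvField {
    // Quote a field if it contains a comma, quote, or newline;
    // embedded quotes are doubled, per the de-facto CSV convention.
    static String escape(String field) {
        if (field.contains(",") || field.contains("\"") || field.contains("\n")) {
            return "\"" + field.replace("\"", "\"\"") + "\"";
        }
        return field;
    }

    public static void main(String[] args) {
        // The failure mode described above: an embedded quote.
        System.out.println(escape("he said \"hello\"")); // "he said ""hello"""
        System.out.println(escape("plain"));             // plain
    }
}
```

Which is exactly why you use a library: it's easy, so everyone thinks they can wing it, and then forgets a case.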

That is now resolved, but this is a beginner kind of error. Yeesh!

Sunday, June 23, 2013

Distributed File System, part 3

I've thought through a lot of this already, but I have not implemented much. But I have gotten started...

Wish I could put a block diagram thing in here somehow...image seems the only way, but I don't really have any.

So you want to have all the shared files available everywhere, but you sure can't keep copies everywhere, and I discussed the idea of cross-mounts, or buying truly massive storage devices, etc... none of those things are workable, really.

So what I think you do is gather the knowledge of what all the shared files are, catalog them, publish the catalog via a web service, and then transparently copy things locally when you need to use them, aging them off later (by a size heuristic, a time heuristic, or an LRU heuristic).
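The LRU flavor of that age-off comes nearly free in Java. A sketch (hypothetical names; a real version would delete the local file copy on eviction and probably also weigh file size and age):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Bounded cache of locally-fetched files (fileKey -> local path),
// evicting the least-recently-used entry once the cap is hit.
public class LocalCache extends LinkedHashMap<String, String> {
    private final int maxEntries;

    LocalCache(int maxEntries) {
        super(16, 0.75f, true); // accessOrder = true gives LRU ordering
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<String, String> eldest) {
        // Here is where you'd actually delete eldest.getValue() from disk.
        return size() > maxEntries;
    }
}
```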

Depending on what's happening across your network, you could wind up with a popular file having copies actually residing in a lot of places...for a while. Most files would only have two locations: the primary shared copy, and whatever there is for standard backup.

At the moment I'm thinking of it rather like a library system. You have your own collection of files (books you acquired somewhere). You are willing to share some of them. Others are likewise willing to share some. There's a "library/librarian" service. You can ask the service what all is available (from all those willing to share--the "library" doesn't have its own repository), and you can have a copy of anything listed, until you bump into your local age-off restrictions. Remember how your local physical library works? You look at the catalog, find something you want, check out a book for 30 days, take it home to be in your personal library, and then return it: i.e., locate a file, copy it locally for temporary use, and then delete it.

If you find yourself having age-off space problems, maybe you buy some bigger bookshelves (i.e., a new and larger disk drive).

This is not a perfect analogy, but works ok for the moment.

So there are some other storage units that could/should participate in this, and they need a proxy of sorts to do so: SAN, NAS--that sort of thing. A NAS device can be just a mountable filesystem, which suggests that perhaps the Librarian needs to take on the management of that, although that doesn't quite fit the analogy the right way: I am thinking of the file-copying as being a lot more like a P2P file-transfer system.

So there's the Librarian service(s), the local shared-publishing service, the P2P file-transferring, and the local storage management. I've written a small part of the Librarian, more of the local shared, I've been looking at file-transfer codes, and merely thought about the storage mgmt. It's all just casual so far, although it's been in the back of my mind for months. Been writing down the use cases, too.  I should have a working system in a couple of months, I think.

[Later: ok, I've put less time into it recently, so not til this fall at the earliest]

Friday, June 21, 2013

The annual V-day M/F relationships writings...

you see online...  

[this blog post was started in 2011 and then forgotten for a while]

There were several interesting ones this year [2011]. The first was from a Mormon woman, let's say late 20s, in a big city (possibly NYC, but I don't recall). She was lamenting the usual "can't find a man" situation. So of course she had some requirements that weren't being met: same religion, no pre-marital sex. IIRC, she was unhappy that guys would not stick around long; not like she wasn't a good catch: good education, good job. Eventually one of them made it clear to her: "You left nothing for us to be/do in your life" (Mormons being still a bit more traditional per historical attitudes.) In other words: you are sufficiently independent that we have no self-perceived value in the relationship--how can we be a "provider" when you don't need that?

A male comment on an entirely different story I read some weeks later put it better: "We need to be needed." When we aren't, well, it's time to leave.

So around V-day there was a story by a woman in NYC who basically said to other women: "Can't find a man? It's you, not them." It went right to the heart of things: what you say you want and what you do aren't the same. Of course there was a firestorm of comments in response. Many were a little off-target ("Why the assumption that every woman needs a man?" -- you have to wonder why those folks even read the story to begin with, and then complained, they aren't the target audience). The author was herself having this trouble, thinks NYC demographics are part of the problem (apparently there are noticeably more single women than men there), but blames herself for essentially pursuing the excitement factor and variety rather than something else.

HU sez: "don't bitch about there being no good men--if you haven't found one then that isn't what you want."

Game Philosophy

What causes a game to be successful? Do you need an Advanced Degree (tm) to figure it out?

To what extent is a game's success based on:

  1. Visuals/graphics
  2. Story
  3. Action
  4. Explorability
  5. Good AI
  6. Other

What do I mean here?

Visuals/graphics: the very best-looking games these days are things like Skyrim, CoD, etc. Fabulous 3D world to wander through. I love Skyrim (altho I think I like Oblivion better, for reasons of greater variety); visually stunning. But other games, less good, have decent graphics, too, and some interesting games have fairly limited graphics. I've replayed Total Annihilation recently (from GOG, despite my having the original install disk), and heck, that's only just barely 3D at all, it's 8-bit color, etc, and yet that doesn't matter in the end--it can still be quite difficult.

Story: Skyrim etc have pretty good stories in them. There's a main plot, and some relevant/valuable major sub-plots, and lots of little tiny things. This all works great. In fact, those little side projects work so well, I haven't even started the main plot yet, and that's after several hundred hours of game time.

Action: Quake 1-3, Unreal Tournament, etc, are all about the action. The 3D-ness of the maps is interesting, but not critical. There's no story whatsoever. For me, this makes for limited interest. Replayability is all about improving your twitch skill. I enjoy the speed and action, but the only real interesting thing about replayability is that you can do it in relatively tiny increments, like 5-10 mins.

Explorability: Half-Life 2 is a great game, but it gets a zero on this scale. It's very linear. Too linear. Dungeon Siege 1 is equally linear (well, nearly so), but you have a lot of leeway in how you play your character/team. Skyrim etc are anything BUT linear--you don't EVER have to play the main story. I like this aspect--I really don't like being locked into playing a game only one possible way, being locked into a developer's limitations--they might as well do machinima of it for you. I'm not suggesting that linearity = ease of play; it means no opportunity to meander around and look at things.

Good AI: This doesn't even apply to a wide range of games. Team Fortress 2, UT04, Quake3, etc. The AI is other humans. Alpha Centauri, otoh, is mostly AI, and can be really hard to take on.

Other: not sure what I think this is, but maybe it's something like you can find in MMO games, where you can participate without exactly being a quest player, like by being a "crafter". This doesn't interest me. I actually felt more distracted by this whole routine. DLC is a new aspect.

Think back a bit on all your games...Pong, 40 years ago, was the absolute minimalist graphics game, but was not at all easy--it was action-only, no AI, playable in tiny increments; Tank was much the same, only very slightly more complex. This was the era when graphics were super-limited. Think of other games where this is some better, but still the game has to be dominated by something else--while better-looking, Diablo is a little less about action, it seems more about the process of managing loot and such like. There's some story of sorts, but I wasn't really keeping track of that too well--despite the maps being mostly unique each play-through, it's still fairly linear.

So where is the trade-off sweet-spot? I'm sure there's a range. Could we describe it, put some bounds on it? Maybe more by example than by measurement. Reason I ask: I have developed a game or two in the distant past (known as the 70s), and have contemplated making one again, but I find myself debating what flavor I would create. Certainly it would avoid things I dislike, like the repair/crafting stuff. I'd want auto-generated maps to maximize replayability. I'd want to have some reasonable amount of action, but not where it devolves into a twitch game. I'd want some reasonable amount of story; I think I'm more story-driven than most folks. My son is more action-oriented, it seems, he can play TF2 for hours/days; he has, however, played Oblivion et al about as much as I have, HL2 more, Mass Effect, Fallout3/FNV more...he does have more time right now, but that won't last.

How much work goes into making a good story? Is it really all that much? If it's not, you should be able to take one of the free "game engines" and make a game. How difficult is it? How do you make it a story you can actually participate in, as opposed to just following a script? Think of making a game from a movie: seems over-constrained.

It seems to me that good story is what really makes a game--for the kind where there even IS a story. Think about it--I think we tolerate less-than-photorealistic visuals for a better story.

So how hard is it to make a really good story? Do you need more than one? Is it even possible to have more than one? They'd mostly have to be disjoint. Perhaps retirement is the time for me to tackle creating a better story for a game. The problem with that is that it is probably going to still feel too linear. If you allow much variability it's going to become very hard to manage reaching a pre-defined endgame conclusion. My goal would probably be to aim for a much less predictable outcome: create a starting point, play rules, and run it more like a simulation, and watch to see what happens.

I need to re-experiment with some AI activities. Can I make something that is largely emergent-behavior and interesting?

Wednesday, June 19, 2013

Distributed File System, part 2

In April, I had a training class called "Intro to Big Data", from Learning Tree. It's really aimed at your getting into Hadoop, but prelim topics were covered first with separate tools. Nice course, really. LT is clearly good at this kind of thing (was my first/only LT course), unlike some other "training" I've had in the last year.

So what sparked my thinking again on DFS/VFS was the segment about Distributed Hash Tables. That might work as the lookup mechanism I need to serve as the complete distributed file table.

Making a distributed database is not easy--even the big guys have trouble with this, and overall performance is not all that great. My fave SQL database, H2, is not distributed.

I do not, as yet, know anything about what sort of performance I need. *I* probably don't need all that much, but running my Grid Engine would need more.

Suppose I take a DHT tool (Apache Cassandra is one possibility) have it store this:

filename, directory path, host

where filename is the access key, and maybe host/path is stored as a URL.

filename, URL

If the URL is good, I could pass it to the Grid Engine as is, and let relevant/interested process(es) use it directly to open a file stream. That could work; it could mean having a lot of file streams/handles open at any one time. (The GE typically wouldn't have more than 100 at a time per machine, probably. Well, maybe 200.) So depending on file size, maybe that's too much network traffic; if nothing else, it's not going to scale well.

Maybe I should be using the file-content MD5 as the key? that is at least fixed size (32 chars). That ends up being much more the DHT approach, because you could distribute keys based on the first character of the MD5 (or maybe the first two, if you had a lot of machines).

MD5, URL
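That first-hex-character distribution is easy to sketch (a toy; the node count and the modulo fallback for fewer than 16 machines are my assumptions, not a worked design):

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Hash file content to a 32-char MD5 hex key, then route the key to an
// owning node based on its first hex character.
public class Md5Router {
    static String md5Hex(byte[] content) {
        try {
            byte[] digest = MessageDigest.getInstance("MD5").digest(content);
            StringBuilder sb = new StringBuilder();
            for (byte b : digest) sb.append(String.format("%02x", b));
            return sb.toString(); // always 32 hex chars: the fixed-size key
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // MD5 ships with every JDK
        }
    }

    // First hex char is 0..15, so this spreads keys over up to 16 nodes;
    // modulo folds that down when you have fewer machines.
    static int nodeFor(String md5, int nodeCount) {
        return Character.digit(md5.charAt(0), 16) % nodeCount;
    }
}
```

Using the first two characters, as mentioned above, would give 256 buckets for a bigger machine pool.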

So what am I doing with these things? Suppose I have what I think a DHT is: a local service which can tell me where a file actually is for a given MD5; that MD5 has come from the Grid Engine. OK, that feels clunky, because I only know MD5s from the GE.



Other tools: HDFS (Hadoop) has several issues: the "ingest problem" (i.e., how do you get all your data into it), internal replication (it wants 3X, although you can set that to just 1X: you lose any redundancy safety, but ingest is faster), and block size, since its default block is 64MB ??!! That's maybe not so painful if your files are all 2GB video...

Another reason to NOT try to use a huge SAN cluster (you can daisy-chain these things) is that you end up having to have a minimum block size around 4k or 8k. Well, that's fine if your files are mostly big, but what happens when you tend to have a lot of 1K files? That issue argues for VFS which lets you use (for example) a ZIP file as a file system, which probably gets around the minimum block-size problem; I expect that has other performance issues, but wasted space isn't one of them.
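The ZIP-as-filesystem trick is actually built into the JDK (the zip filesystem provider, since Java 7). A sketch, with made-up paths and content:

```java
import java.net.URI;
import java.nio.file.FileSystem;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Map;

// Mount a ZIP archive as a FileSystem, so many tiny files share one
// container file instead of each burning a 4K-8K disk block.
public class ZipFsDemo {
    public static void main(String[] args) throws Exception {
        Path zip = Files.createTempFile("pack", ".zip"); // scratch archive
        Files.delete(zip); // the zip provider wants to create the file itself

        URI uri = URI.create("jar:" + zip.toUri());
        try (FileSystem fs = FileSystems.newFileSystem(uri, Map.of("create", "true"))) {
            Path tiny = fs.getPath("/notes.txt");
            Files.writeString(tiny, "a 1K file costs ~1K here, not a whole block");
            System.out.println(Files.readString(tiny));
        }
        Files.delete(zip);
    }
}
```

Random writes into an archive won't be fast, but as noted above, wasted space isn't the problem here.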

Friday, June 14, 2013

Distributed file system, part 1

There's a lot of data around on a lot of computers everywhere...far too much to fit on any one machine, or even on some kind of larger storage in any cost-effective manner for us little guys.

At work I have a SAN, 100TB available storage. THAT is a lot of storage; but given what I do there, actually not all that hard to fill up. But that kind of device STILL does not solve the larger problem, nor was it very cost effective--I could replace the drives, from 2TB to 3TB, but that would only be a 50% increase...suppose I need a 10X increase? 100X? More?

2TB drives aren't very expensive any more (you know, it seems almost absurd to even be able to say that, given that my first computer had a 20 MB drive in it), and it's not hard to find dirt-cheap machines around, used or even free. Regrettably they are seldom small, and therefore tend to be a little power hungry...not a prob for a data center kinda place, but uncomfortable for me at home.

Suppose I decided I had a problem to work where 30TB looked like the right capacity...and let's say that means 10 machines @ 3TB each...

I've written a heterogeneous distributed OS-agnostic Grid Engine. Perfect for doing data processing on a 10-node cluster. But this really works best when all the nodes are using a shared/common file system. THAT works best with a SAN and a Blade Server, like at work. Well, the blade server part isn't really very expensive ($3k will buy a decent used one that is full, and pleasantly fast--look on EBay for IBM HS21 systems). But getting a SAN on there--not going to happen. OK, I could perhaps put some high-cap 2.5" drives in the blades, etc, but that doesn't solve the resulting problem, which is still how do they share data with each other?

Well, on a limited basis you can make file shares and cross-mount all the shares across all the machines--but that doesn't scale all that far, and those shares all become a nightmare--and they STILL aren't a shared common file system.

So really the problem I have is how to make a shared common file system across a bunch of machines? I need it to be heterogeneous, since I run Mac/Win/Linux machines, and am considering other things like Gumstix.

There are homogeneous file systems around...several, it turns out, although they are mostly Linux-only (FUSE, Lustre/Gluster, etc), which doesn't help me. OK, I could just buy the cheap hardware, and install Linux everywhere, but what happens when I have a windows-only software tool to run?

I've been hunting for an OS-agnostic tool; it's not really clear whether there is such a thing. OpenAFS (i.e., the Andrew File System) might do it, which would be perhaps the ideal solution. I haven't tried this yet. Pretty much everything I've read about doesn't meet my requirements, heterogeneous being the first fail point. At work I'm using StorNext with the SAN, but I can't afford that on my own.

So I think I have to solve this myself. What I kinda think I want is a BYOD approach where you'd have to run some agents to join, but you'd have access to everything shared on the network without having to cross mount a zillion things that you can't even find out about casually.

What you would NOT have is something that shows up in Finder/Windows-Explorer. I can probably figure out how to finagle that too, altho I don't consider that a critical requirement. I expect that OpenAFS has that figured out.

Is it going to take an Advanced Degree(tm) to figure this out? It's not an easy problem.

Sunday, May 05, 2013

Young female guitar players

been You-tubin some the last few days...cause a friend pointed me at Joe Bonamassa on YouTube. OK, so I don't know who he is...he's good, but obscure. His better work seems to be when he's playing with someone else more famous and with better tunes. (Later: just found out that YT app is available on iphone from Google--yay! After Apple punted theirs)

and then I happened on Orianthi. and then Desiree Bassett.

Wow.

Neither of them is really old enough to play the blues, in terms of the negative life experiences that hammer your soul the right way. So their original "songs" aren't that great...so you're really there to listen to the guitar work. And that part is phenomenal, most of the time (example where it's not: Desiree plays Jeff Beck's "Because We've Ended as Lovers", pretty much note perfect, as though she had learned it from sheet music--problem is, that's supposed to be a really mournful, melancholy tune, and she plays it with way too sharp an edge, too upbeat)

I wanted to be that kinda good on guitar, but I'm not.

There are a few others...Ariel...Juliette Valduriez.

----------

Been therefore digging on a few other YouTube things...Clapton's Crossroads festivals, which I hadn't heard of before (with DVDs now on order)...Concert For George (which seems not to be available in full on DVD, but the whole thing is on YouTube? [although blocked])

Later: concert DVDs arrived, that is some nice stuff. Now to rip the audio and import into iTunes.

Advanced Software, leading to PDVFS

I generally work on somewhat exotic software projects. Cutting, if not bleeding, edge.

Was early in the Semantic Web stuff 2000-06, the AI stuff in the 80s, other oddments like Digital Mapping (starting in the 80s), text processing (starting mid-90s), I wrote one of the very first GUI builders (late 80s). A health-care R&D effort in the early 90s would still be cutting edge today.

My latest bit of exotic is a Grid Engine. Granted, not anything new, other than mine is OS-agnostic. You can readily find the other grid engines, but they are not really agnostic. Mine runs Windows (XP/7), Linux (probably any flavor) and OSX (at least 10.6+). The whole thing is of course written in Java, which is why it's agnostic. It should run anywhere a Java 1.6 JVM runs properly (possibly including JME, I haven't a way to test there--it would depend on the lightweight thread support).

I'm now processing a lot bigger datasets than I used to, thus the Grid Engine, in order to distribute processing adequately. I have, so far, run it on two systems: 3 machines with 64 total cores, and 12 machines with 48 total cores. It's designed to run on A LOT of machines, but I'm pretty sure that there are undiscovered scale-up problems along the way. There's no imposed maximum.

Because the datasets are now bigger, I have to think about additional problems. In particular, where does that data go? Everything is fine as long as the dataset is under 2 TB, because that fits a single disk just fine, but then you have the issue of how many clients have to be served by that disk, and therefore how much punishment the disk is taking over time; this is the arrangement I have on the 3/64 machines, with no apparent disk degradation yet. If you use a SAN, you can certainly make a much larger apparent single partition; this is what I have with the 12/48 machines, that's a blade chassis with an attached FC-SAN, with 60/15/15/5/5 TB partitions. You set up the SAN for the partition sizes, and use separate software to manage how the blade units see the SAN; works fine, that's really a lot of space, you CAN daisy-chain another SAN onto it, but that isn't really solving the problem--because I've already burned out two disks in it.

I want/need to distribute data differently, so that I am achieving a more random spread of data over storage devices. I want to work this with the grid engine. I need it to be heterogeneous across random hardware.

So of course Hadoop HDFS sounds like a possible, but there are some reason why not. Hadoop is not oriented around this kind of data, where file sizes range from 100 bytes to 3 Gig. Hadoop wants a 64 MB file-chunk size--I don't have that. I need to use native file systems and disk behavior.

Looking at various experimental file systems, nothing seems to do the right job, or be adequately OS-agnostic. There are several Linux-only possibilities, which are probably closer to what I want, apart from being Linux-only.

Initially I thought I wanted real mounted file-systems. AFS seemed the likeliest solution, but I think it likely has some problems. I don't know what, specifically, except that I wonder what it means to be writing files out--where are they? It looks like a unified file-system, DOES appear OS-agnostic, but...I don't know.

So I'm now thinking about something that isn't actually a file-system, but a P2P-FS-like thing. I need some not-quite-normal capabilities. And I ultimately want it to run on anything that has file storage (or fronts for it, like a NAS). Going to be interesting working this...

Saturday, May 04, 2013

The cars again

Right after xmas I (for some reason I forget) discovered that there were meeces in the garage. I should have put the poison out immediately, but I didn't. They ended up getting inside my lovely XKE and doing some chewing on things. For meeces, and a car that age, that means seat cushion insides, and cloth-covered wiring. So now I have some flaky wiring behavior: sometimes the dashboard instruments cut out. Grr.

I hate meeces to pieces.

Tuesday, March 05, 2013

Buying a house...

Well, we're now trying to buy the retirement house. Which does not mean that we are retiring, but we have this opportunity to buy something that has some specific interest.

I'd forgotten how complex and time-consuming that process is...it's been > 20 years since we last bought.

Timeline:
Dallas, Fall 1982: my first house. Not big, but fine for me solo. Interest: 11% ? [ouch!]. ARM.
Fairfax, Summer 1991: first house with family. Wife is pregnant with second child. Interest: 9%
       Refi: 1996. Interest: 7%.
       Refi: 2002. Interest: 5%
Central VA: Spring 2013. Retirement house. Interest 3.875%. Fairfax house refi: 3.75%--yay!

It's been great how the interest rate has come down each time. And as of this writing, Fairfax house will be paid off in about 5 yrs. [later: we did a third refi, it's at 3.75, minimized monthly, but now back out to 30 years]

So why this house? Incomplete information, but it's the middle of really old family ancestry area. Many people in the area will turn out to be distant cousins. The house is pretty cool, in our price range, the view out back is great (instead of car traffic, it's a mountain ridge, although not at the same distance), it's very quiet (cow noise is all you hear during the day). I couldn't afford anything more than what I have now if we stayed here in Fairfax, which would be kinda boring.

Plus, it feels like time for a new adventure. It's a fairly rural area, so we'll be doing a variety of things differently. Our current plan is that we will continue to primary-house in Fairfax for a while, be at new house weekends/holidays/vacation/etc., and gradually move into it permanently over several years. This is an ideal approach, as it lets us slowly figure out what goes where, rather than do it all in a rush, which is what I have *always* done in the past. What will be awkward is furnishing two houses for a while, with things we won't all keep.

Well, it's been an adventure so far. More complex than I remembered, more paper to worry about. As it's rural, different things to worry about: we will have a well, and a septic system. That means no water and sewer bill, or at least not in the same way: a little bit of electricity for the well pump, and an occasional septic pumping out. Undoubtedly less than the monthly amount here in Fairfax. Satellite for cable/internet, maybe phone too. And if I manage to inherit enough money, we may even see about a solar installation, so as to eliminate that as well--the property will have more than enough space.

Sunday, January 20, 2013

Do you/I believe in God?

What do people mean when they say that?

I think it's an implied different question: "Do you believe in God the same way I do?"

The answer to that implied question can't possibly be "yes" because you don't ever really know what the asker's beliefs are. The asker may not even really know, him/herself. How many of us really think about it in any detail?

Do you have to have an Advanced Degree (tm) to figure it out?

My impression is that we are all selective about which exact details of the religious dogma we believe. Including, and perhaps especially, those who are most religious. Given the number of inconsistencies and contradictions in any Bible/equivalent, why do you have to believe anything one way or the other? If we were all really strict about it, how would anyone ever commit a murder?

It probably all boils down to the essentially unanswerable questions of "why are we here?" and "what comes after this?" and the worry about there not even *being* an answer.

One bit of description about the afterlife I've read is that it is as different from this life as being an adult is from being five years old. You can't even *describe* being an adult to a five-year-old; they have neither the vocabulary nor the concepts nor the experiences to even begin to understand. So we can't understand the afterlife, if there even is one. (And really, if there is, and it's overmuch like this life here on earth, with *all the same people*--do you really want to go through eternity with the same bunch of idiots you have to deal with now? I don't think so.)

So why ARE we here? Perhaps it's simpler to think about how we got here first. Suppose you were God...if you can create this amazing and complex universe, why would you do it? If we can't understand the afterlife, we probably can't understand this either, but it is at least more amenable to speculation.

The universe as we know it seems tuned to increase the likelihood that [intelligent] life will come to pass somewhere. It would not take much change to some basic physical/chemical properties of matter to get a still-workable universe that is completely incapable of producing life, or one that collapses before the basics of life could even start. That suggests this universe was designed to eventually produce life, with the eventual outcome that that life becomes intelligent (why would you bother with life that does NOT eventually become intelligent?). Thus, a Creator of some sort.

But if you were God, and you could design and create this universe, why would you do it? To have worshippers? Ick--that's a creepy concept. I can't even imagine a being so smart and powerful wanting such a thing--I would not want to meet/know such a being. To have complete and total advance knowledge of how every single instant of time would go, how every single action of every single atom would combine or change? Seriously? What would be interesting about that? (it'd be worse than being Dr Bloody Bernofski.) It is the process of discovery and the unexpected surprises that are interesting.

I think that what would be interesting would be (if you were able to do so) to create a universe like the one we have, where randomness was an essential characteristic, where there were rules about how physical matter behaves, but with designed-in unpredictability (stemming from the randomness). Your goal would really be more about observing emergent behavior. The evolution of life. A universe with no emergent behavior would not be very interesting. What will happen? You don't know until it does. How and where will life first develop? How many variations of it CAN develop? There are opposing forces at work--the self-organizing aspects of chemical reactions, and the de-organizing behavior of entropy. Can intelligent life develop at all? Just how far can it evolve? How long can it last? What sort of interesting things can it do for itself? Can intelligent lifeforms evolve in different places and discover each other? How different are they? What can they do together?

Would you ever really meddle in the ongoing development of life? Why? At what level? Because one of the individuals asked you to? I wouldn't. In fact, I wouldn't even be listening like that. What would be the point? Where would it stop? What would make one request better than another?

So are we just the results of God's PhD thesis research? Unknowable, I think. Does it matter? I personally don't care one way or the other.

So I probably don't believe in God the way you'd mean it if you asked me.

[Later: this past week was "talk with young atheists" on NPR (well, it wasn't called that, but it WAS that). The question posed was really "Did they believe in God? how did they lose the belief?" Nobody said anything like the above.]