Den Ben’s Blog

February 29, 2012

Retiring the blog – for now

Filed under: Propaganda — benpittoors @ 19:50

Keeping a blog up to date with all the stuff that interests me is just not my thing (as you may have noticed). There are more important things in life than showing off by writing an endless series of blog posts about the most obvious things, or about bashing technologies, or about ranting against companies/people, or about how f*in smart I am.

I’ll leave it online though, since my Korg M50 – Reaper post is vaguely popular and has already helped out a few M50 (and M3) owners.

Meanwhile, I still experiment a lot with music, cooking :), Android, BI and lately also GIS technologies. You can follow me on Twitter and/or circle me on Google+.

Who knows what the future will bring – look out for that next leap day :). If I ever find a good balance in my life I may write an occasional blog post about it although chances are that I’ll prefer posting it on my Google+ stream instead.

December 6, 2009

Using a KORG M50 together with Reaper (DAW)

Filed under: Music Production, Reaper — benpittoors @ 21:32

A little background info

About a week ago I bought myself a shiny new synth! Well, it’s actually more of a workstation than just a plain synth. I thought it was time for 88 weighted keys backed up by some seriously good synth capabilities. After googling around a bit I decided to go for a brand new KORG M50 88. I almost bought a second-hand M3 88, but the seller wouldn’t bend for my maximum bid… can’t really blame him, it would have been a great deal for me. The only sad thing is that he made a final counter offer AFTER I bought the M50… too bad: I already had my gear, and in the end it may still be better for me to have a brand new M50 instead of a second-hand M3, although I will be missing out on the M3’s sampling capabilities and built-in KARMA.

Still, I’m really happy with the M50. I don’t really need a sampler anyway and for the KARMA… well, I can still upgrade to software KARMA (running on a PC/Mac) in the near or distant future.

After playing around with the keyboard in standalone mode I quickly noticed that the built-in sequencer is just not my cup of tea. It’s way too tedious to perform even the simplest of operations (think select, cut, copy, paste kind of stuff). Frankly, I expected it to be like this and to be honest it is still better than the built-in sequencer of my MC505… but since I will mainly use the synth as a controller / sound module inside my home setup, I am not planning to pre-program combis, sequences, arpeggiators and other stuff for any live gigs.

Introducing Reaper

As you may know, Reaper is a pretty complete, solid, full-featured digital audio workstation (DAW) that has had some pretty good reviews. It has been a while since I used this kind of software (I have an ancient Sonar version laying around here somewhere), but since it comes in at a price point for home use you can’t beat, I thought I’d give Reaper a spin and see where it takes me. The evaluation license gives you a fully operational environment that you are allowed to use for 30 days (not enforced), with the only ‘annoyance’ being a 5-second start-up delay.

After fiddling around for a while I finally managed to get the KORG M50 MIDI-integrated into Reaper, and I thought I’d share the process in this blog post.

Prerequisites

You need to install the KORG M50 USB MIDI driver and the M50 Plug-In Editor VSTi from the supplied CD-ROM (or alternatively, download them from the KORG website).  Hook up your M50 to your PC using a USB cable (or alternatively hook it up using plain MIDI cables… I can’t vouch for this 100% since I haven’t tried it myself, but that should also work).  You can test the connectivity by launching the standalone M50 Plug-In Editor and seeing if it synchronizes with your M50.  After you’ve tested it, close the standalone editor again, since otherwise it will prevent the VSTi from loading.

You’ll also need to install Reaper of course :)

This blog post is about getting the M50 Plug-In Editor operational inside Reaper, which means you will have perfect MIDI integration.  I won’t explain how to record the M50’s audio inside Reaper… I may do so in a future post (and on a side note, you will need to hook up the M50’s audio out to your PC’s audio in in order to do so, since the USB connection does not carry any audio signals; only MIDI).

Configuring Reaper

In order to make Reaper find the VSTi you’ll need to configure its location.  The default installation folder is %program files%\Vstplugins\KORG, which on my Vista 64 system translates into “C:\Program Files (x86)\Vstplugins\KORG”.  You can do this by opening the menu ‘Options – Preferences’ and navigating to Plug-ins / VST.  Fill in the correct location and click the ‘Rescan directory’ button to let Reaper find the VSTi and add it to its repository.

Also important: the plug-in will take control over your M50, so make sure it is disabled in Reaper’s MIDI configuration.  In the same preferences screen, choose Audio / MIDI devices and disable the M50 input and output if that is not already the case.

Add a VSTi instrument track

Choose the menu ‘Track – Insert virtual instrument on new track…’.  Choose All Plugins/Instruments and if you’ve configured Reaper correctly it should list VSTi: M50 Plug-In Editor (x86) (KORG).  When you choose the plug-in and click ‘OK’ it should start synchronizing with your M50 (this can take a minute so be patient) eventually showing the plug-in UI.

Configure the M50 VSTi

Some additional settings need to be made on the M50 to be able to use it both as a MIDI input controller and as a MIDI playback device.  You can do this on the M50 itself, or by using the Plug-In Editor VSTi.  I believe I can best explain this using the VSTi: click the ‘Global’ button and select the MIDI tab.

In the Basic section, check ‘Local Control On’.  This will allow the M50 to listen to its own MIDI commands while recording them in Reaper.

Set the MIDI Clock to ‘External USB’ (or alternatively ‘External MIDI’ if you hooked up your M50 using MIDI cables instead).  This gives Reaper control over the MIDI clock (tempo and transport controls…).

Any changes you make inside this configuration are not immediately synchronized with your M50.  In order to apply the changes press the ‘DUMP’ button to send these settings to your M50.

One more important setting (which took me a while to find) is on the Global – Software Setup tab. Make sure the option ‘Send M50’s MIDI Out data to the host application (VST Plug-In only)’ is checked! This is a plug-in configuration setting, so you do not need to synchronize it with your M50.

Keep it simple: for the moment I suggest you choose a simple program instead of a combi.  Experiment with combis and drum tracks after you’ve got this working first.

Almost there…

Reaper can record lots of different types of input.  The track with the M50 VSTi needs to be set up to record the output of the VSTi (yep, the VSTi takes the input from the M50 and routes it to its own output).  Click the ‘Select Recording Mode’ button of the track (it defaults to ‘in’ and is the second button on the lowest row of the track).

Choose ‘Record: output’ – ‘Record: output (MIDI)’.  When you’ve selected that, the small ‘in’ button should change to ‘out’.

Done!

That’s right.  Now you can record, edit and play back MIDI from within Reaper using your M50!  You still have only a single track for now, but you can easily add more in the same way. Do make sure, however, that the MIDI channels of the different plug-in instances do not overlap (unless you want them to…).  Start experimenting with combis and drum tracks… I’ve pointed out most of the important settings, but there are many, many more :)

Update

You cannot run multiple instances of the VSTi.  However, you won’t need to, since the VSTi controls all of the M50’s MIDI channels when you put it in Sequence mode (or half of them in Combi mode).

March 17, 2009

Convert SVG to XAML

Filed under: Silverlight, WPF — benpittoors @ 20:00

For one of our current projects I needed to convert some hazard symbols from SVG to XAML to use them in a Silverlight GIS front-end.  After querying the web I found several tools that supposedly do so.  However, none of them fit my needs.

  • A few of them were paid products, and they only offered severely crippled demo versions
  • One of them was a plugin for Adobe Illustrator, which I do not have :)
  • Inkscape has an export-to-XAML function (Save As -> Microsoft XAML), but trying that on the hazard symbols mentioned above (you can download the SVGs following that link) turned out to be no good.  While the XAML could be parsed correctly from a technical viewpoint, it rendered images that were nothing like the original SVGs

One of the search results mentioned the Microsoft XPS Document Writer printer driver.  And to my surprise, you can in fact use that one in the process of converting SVGs to XAML :)

Here’s what I did:

This may seem a bit tedious, but it actually works quite well once you have done a few conversions :)

First, you need an SVG file.  For example: this one

Secondly, you need to open that SVG file in Inkscape (I have had no luck printing the SVG from my browser, probably because it uses some weird plugin to render the SVG).

Then print the SVG from within Inkscape to the Microsoft XPS Document Writer.  You have to specify a file name:

[screenshot: the XPS Document Writer file-name dialog]

Then go to the location of the .xps file and rename its extension to .zip (yep, apparently it is also a zip file :))

Browse the zip file for \Documents\1\Pages\1.fpage and extract that file.

Optionally rename its .fpage extension to .xml to open it up in your favourite xml editor.  Alternatively leave the file as-is and open it in any text editor (notepad will do).

You should see something like:

<FixedPage Width="816" Height="1056" xmlns="http://schemas.microsoft.com/xps/2005/06" xml:lang="und">
	<Path Data="F0 M 4.32,528.8 L 528.8,528.8 528.8,4.32 4.32,4.32 4.32,528.8 z" Fill="#ffff7d00" />
... snip ...
</FixedPage>

If you copy and paste all the lines enclosed between the <FixedPage> tags into a <Canvas> element, you have fully functional XAML ready to be used in your Silverlight or WPF application :)
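
The simplest way to use the result is to paste that <Canvas> markup straight into a UserControl. If you’d rather load the converted symbol at runtime, a rough WPF sketch could look like the following (the file name is made up, and it assumes you saved only the <Path …/> lines, without the surrounding <FixedPage> element). Silverlight has a similar XamlReader.Load that takes a string instead.

using System.IO;
using System.Windows.Controls;
using System.Windows.Markup;
using System.Xml;

public static class HazardSymbolLoader
{
    public static Canvas Load(string pathFile)
    {
        // Wrap the extracted <Path .../> lines in a WPF Canvas...
        string xaml = "<Canvas xmlns=\"http://schemas.microsoft.com/winfx/2006/xaml/presentation\">"
                      + File.ReadAllText(pathFile)
                      + "</Canvas>";

        // ...and let the XAML parser turn the markup into live objects.
        using (XmlReader reader = XmlReader.Create(new StringReader(xaml)))
        {
            return (Canvas)XamlReader.Load(reader);
        }
    }
}

// Usage, e.g. in a window's code-behind: LayoutRoot.Children.Add(HazardSymbolLoader.Load("hazard-paths.xaml"));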

March 8, 2009

Getting my hands dirty again

Filed under: Propaganda, Software Development — benpittoors @ 14:51

It’s been more than a year since I last actively participated as a developer in a development team.  Sure, I wrote some kick-ass T-SQL scripts and the occasional line of C#.  But over the last year my tasks were focused more on data modeling, functional design and project and team coordination than on writing source code.

That was until about 3 weeks ago, when I was given the chance to write some code again. I wrote most of the functional design for the project, and a team of 3 developers had already implemented a great deal of it.

Now, during the time of my source-code-writing abstinence, my colleague Davy Brion had finished serving his time in Enterprise Hell and was asked to strengthen our in-house software development team.  Davy introduced a lot of technologies and patterns that extended the way we previously developed our software projects.

He ditched our code-generated data layer and introduced NHibernate instead.  Being a big fan of dependency injection, he also threw Castle Windsor into the picture.  And to top it off, for our unit tests he showed us how to easily mock dependencies using Rhino.Mocks.

It only took me about a day or two to fully understand and start using the new architecture in a productive way.  So how did I manage to grasp these concepts so quickly?  Well, the answer is easy: they aren’t all that hard to understand!

NHibernate

While I had no experience with NHibernate at all, I already knew the general idea of ORM.  And as I joined the development at a stage of the project in which a lot of functionality was already implemented, I had some real-life examples of mapping files, entities and data repositories laid out in front of me.  Taking it from there was not that hard, although for the occasional question it is kinda handy to have an NHibernate developer on your team.
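
To give an idea of what those examples look like, here is a minimal, hypothetical entity and repository in the NHibernate style. The Invoice names are made up for illustration and a corresponding mapping (hbm.xml or similar) is assumed to exist; this is a sketch, not code from our actual project.

using System;
using NHibernate;

// Properties are virtual so NHibernate can create lazy-loading proxies.
public class Invoice
{
    public virtual int Id { get; protected set; }
    public virtual DateTime InvoiceDate { get; set; }
    public virtual decimal Amount { get; set; }
}

public class InvoiceRepository
{
    private readonly ISession session;   // the NHibernate session, typically injected

    public InvoiceRepository(ISession session)
    {
        this.session = session;
    }

    public Invoice Get(int id)
    {
        return session.Get<Invoice>(id);   // load by primary key, null if it does not exist
    }

    public void Save(Invoice invoice)
    {
        session.Save(invoice);             // schedules an INSERT, flushed with the session/transaction
    }
}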

Castle Windsor

Again, I knew about dependency injection as a way to implement strategies and the like, but seeing CW in action really blew my mind.  You just register your components in code (or XML) and don’t care about them anymore.  The IoC container will automatically inject the configured dependencies for you (as long as you provide the means to do so by adding the corresponding constructor parameters or public properties).
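
A minimal sketch of that register-and-forget pattern, with made-up component names (this is not our actual registration code):

using Castle.MicroKernel.Registration;
using Castle.Windsor;

public interface IInvoiceNotifier { void Notify(int invoiceId); }

public class MailInvoiceNotifier : IInvoiceNotifier
{
    public void Notify(int invoiceId) { /* send a mail... */ }
}

public class InvoiceService
{
    private readonly IInvoiceNotifier notifier;

    // Windsor sees this constructor parameter and injects the registered IInvoiceNotifier.
    public InvoiceService(IInvoiceNotifier notifier)
    {
        this.notifier = notifier;
    }

    public void Process(int invoiceId)
    {
        // ... do the actual work, then notify ...
        notifier.Notify(invoiceId);
    }
}

public static class ContainerSetup
{
    public static IWindsorContainer Build()
    {
        var container = new WindsorContainer();
        container.Register(
            Component.For<IInvoiceNotifier>().ImplementedBy<MailInvoiceNotifier>(),
            Component.For<InvoiceService>());
        return container;
    }
}

// var service = ContainerSetup.Build().Resolve<InvoiceService>();  // the notifier comes along for free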

Rhino.Mocks

We had already been writing unit tests for a few years (it is at the core of our Genesis development methodology) before Davy introduced Rhino.Mocks.  The problems a mocking framework solves were not new to me: I had already stubbed some dependencies with manually written ‘test’ code.  Still, the simplicity of creating mocks, stubs and expectations with Rhino.Mocks really impressed me.  And the unit tests are _fast_!
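
To show what that simplicity looks like, here is a hypothetical NUnit test using the Rhino.Mocks AAA syntax, reusing the made-up IInvoiceNotifier and InvoiceService from the Windsor sketch above:

using NUnit.Framework;
using Rhino.Mocks;

[TestFixture]
public class InvoiceServiceTests
{
    [Test]
    public void Processing_an_invoice_notifies_the_notifier()
    {
        // Arrange: a generated mock replaces a hand-written test double
        var notifier = MockRepository.GenerateMock<IInvoiceNotifier>();
        var service = new InvoiceService(notifier);

        // Act
        service.Process(42);

        // Assert: verify the expected call was made on the dependency
        notifier.AssertWasCalled(n => n.Notify(42));
    }
}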

So, All’s Well That Ends Well?

I did manage to screw one thing up though :)  I wrote a piece of code that created a new entity in memory and did not initialize a non-nullable field.  Since the tests for that code mocked the data repository that is responsible for persisting the entity, they ran just fine while in fact they covered faulty code.  Adding an extra test for every non-nullable field would solve this problem, but I don’t believe that is the best approach (a bit tedious, don’t you think?).  We’re already thinking about an elegant solution for this… maybe one day you’ll read about that on Davy’s blog or so… :)

December 2, 2008

Software Factories and Frameworks. How to maintain infrastructure code?

Filed under: Software Development — benpittoors @ 20:53

In reaction to Davy Brion‘s post about how to deal with common infrastructure code for multiple projects, I’d like to share my take on said topic.  I already placed a brief comment on Davy’s blog, but maybe I need to go into a bit more detail to make my point fully clear.

Davy lists these 3 options, each with its pros and cons:

  1. Infrastructure code as a separate project.  Binary dependency per client project.
  2. Infrastructure code as a separate project.  Copy source code into client project repository.
  3. No separate infrastructure project.  Each project contains its own infrastructure code.

If you compare the options, you get ‘more consistency over different projects’ at the Option 1 end and ‘more flexibility’ at the Option 3 end:

Option 1
  • Highest level of consistency
  • Requires a lot of discipline in versioning and maintaining
  • Has to provide many extensibility points for clients to implement their custom behavior
  • Least flexible

Option 2
  • Inconsistencies possible between client projects
  • Still requires a certain level of discipline for maintaining different versions
  • Also requires extensibility points for custom implementations
  • A bit more flexible

Option 3
  • Inconsistencies between client projects are as good as certain
  • No need to maintain different versions
  • No extensibility points required
  • Very flexible

The question is:  What option would you take?

Easy question, not so easy answer!  My take would be not to pick a single option but, depending on the specific part of the ‘framework’, to place that part into one of the 3 options.  So, in the remainder of this post I’ll refer to these 3 options as category 1, category 2 and category 3 respectively.

Category 1 code should be (just my first thoughts, some of these points may be hard to implement):

  • UI paradigm independent.  Code in this category should not need to know whether the client project is an ASP.NET, a WinForms, a WPF or even a Silverlight application.
  • I’m thinking mostly interfaces and abstract classes here… Extensibility is key.
  • Very generic code.
  • Probably a relatively small code base.

Category 2 code should be:

  • More concrete, but still reusable over different projects.
  • I’m thinking default implementations here (e.g. an ActiveDirectoryAuthentication component, an EventViewLogger, an ASP.NET BasePage, etc.; see the sketch after these lists)
  • If proven to be very stable (both in behaviour and in requirements), it can possibly be moved to category 1.
  • And vice versa: if proven to be unstable (i.e. changes quite often), move to category 3.

Category 3 code should be:

  • Very specific for the client project.
  • Ideally some components start their life cycle in this category, but are moved/refactored to category 2 if their reuse becomes apparent by taking on specifications of new client projects.
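
To make categories 1 and 2 a bit more concrete, here is a tiny hypothetical sketch (the logger names are illustrative, not an actual framework of ours): the interface would live in category 1, the default implementation in category 2, and a project-specific variant would stay in category 3.

using System.Diagnostics;

// Category 1: a very generic, UI-independent extensibility point.
public interface ILogger
{
    void Log(string message);
}

// Category 2: a concrete default implementation that is reusable across client projects.
public class EventLogLogger : ILogger
{
    private readonly string source;

    public EventLogLogger(string source)
    {
        this.source = source;
    }

    public void Log(string message)
    {
        EventLog.WriteEntry(source, message);   // writes to the Windows event log
    }
}

// Category 3: a client-project-specific ILogger (e.g. one that adds that project's correlation id)
// would live in the client project itself and never move into the shared framework.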

Now, how to organize these categories in assemblies, source control, etc. is not part of this post.  I just wanted to share my theoretical view on this.  Turning this into something practical is the next step…

October 25, 2008

Why does Apple memory cost so much?

Filed under: macbook, Propaganda, Rant — benpittoors @ 20:08

After installing a .Net development environment in a VirtualBox Vista machine on my macbook, I decided to upgrade from 2GB to 4GB so I fired up a web browser to the Apple Store. Man, was I disappointed…

The upgrade kit that includes two 2GB DDR2 (PC5300) laptop memory modules is listed at 280 EUR. And that was about 3 times as much as I was willing to spend on it.

Of course, I knew Apple would charge more than average for a memory upgrade. And I’m certainly not the first person to have noticed that :-). But I bought the memory for a mere 50 EUR some place else. That’s right! Less than 1/5th of the Apple Store price. So ‘above average’ is kind of an understatement here…

Ohw, and for those of you wondering: I bought “Kingston ValueRAM SO-DIMM 4GB DDR2 Kit (KVR667D2S5K2/4G)” and it works just fine in my macbook. 2 runs of memtest reported no errors.

Side note: with 2GB of allocated memory, the VirtualBox Vista experience index for ‘Memory’ went up from 4.5 to 4.8 ;-)

October 21, 2008

Running a virtual Vista on OS X without Parallels Desktop or VMWare Fusion

Filed under: macbook, Propaganda — benpittoors @ 21:13

Right! A one hour a day limit. Sure…

Ohw-kay… Setting a one hour a day limit seemed a bit optimistic at the time. But then again, I just didn’t find the time to blog about all the interesting things (uhuh) I did over the last few months. I know, I know, it’s pretty embarrassing; showing off my knowledge and all. But I’ll just smile about it and act as if nothing happened.

On topic again (almost)

Among other things I neglected to do over the past few months (OK, OK, I’ll drop it in a moment… just bear with me on this. I promise there is a point) I also lost interest a bit in my macbook. Which is a shame, because it’s an awesome machine (mine is a 3,1 by the way). The reason for this, mostly, was that I do not have a .NET development environment on it. And since I also own a 4GB RAM Intel Core 2 Duo Vista PC I just didn’t find any reason to invest in Parallels Desktop or VMWare Fusion. But luckily my music composing hobby recently led me back to the mac. And once again I immediately felt comfortable in Leopard. So I googled around a bit for other virtualization options on Mac OS X and I came across this page. You’ll find lots of links to virtualization software and (sometimes very) brief descriptions of what the software does and is capable of. I decided to give the best free alternative listed a go, which is VirtualBox. I’m very pleased with it so far and although I’m not going to write an extensive review about it I do feel I have some things to share… ohw, and these are my experiences with VirtualBox on Mac OS X (10.5.5) running Vista 32-bit as a guest OS, but VirtualBox can also run on Windows or Linux, and run them as guests too.

Installation

Installing VirtualBox is as easy as any other software installation on OS X. You download a .dmg, it automatically gets mounted, you run the installation package, follow the installation wizard and the thing ends up in your Applications folder. It also places a bunch of user files under your home folder in Library/VirtualBox.

Creating a new Virtual Machine

There’s a wizard ;-) Most of the options will look very familiar if you have any experience with other virtualization software (Virtual PC, VMWare…). Apart from naming your new virtual machine you have to choose an OS Type first. The list is very long, ranging from DOS through the most common Linux distributions to all possible Windows versions and even OS/2 Warp. I picked Windows Vista… since that was what I was about to install.

Then it suggested a base memory size of 512MB. That seemed a bit low to me, so I upped it to 1024MB (fyi, my macbook has 2GB of RAM).

After that, I needed to create a virtual hard disk. I chose a dynamically expanding one (less space in the beginning) giving it 30GB to go (which is the size it will report to the guest operating system). Unlike Parallels or VMWare Fusion you do not have the option to mount a bootcamp partition. Which is fine by me… I’m not planning to install a native Windows on my mac any time soon.

Before firing it up I mounted a Windows Vista evaluation copy installation ISO. This couldn’t be done straight away: I had to add it to VirtualBox’s image library first. Not that it wasn’t easy to do so, but it seemed a bit odd to me not to be able to just mount any ISO image I browse for. On a side note: I mounted the ISO over an SMB network share and this wasn’t an issue at all. You also have the option to map your physical CD/DVD drive, by the way.

Vista Installation

Upon starting up the virtual machine VirtualBox kindly notified me of the host key, which on Mac OS X is the left cmd key. The host key is used to release the keyboard and mouse capturing back to the host operating system.

The Vista installation took about half an hour. Then I noticed there was no network support. Apparently Vista didn’t recognize the standard emulated network adapter. This I found in the VirtualBox manual, which of course I did not read up front ;-). The solution was very easy: I just had to install the VirtualBox guest additions (which are very similar to the Virtual PC Additions). Apart from a custom network interface driver, it also installed a bunch of other things enabling the guest operating system to resize its host window, automatically capture/release mouse input, etc. Very nice! One remark, however, is that the resizing doesn’t really go smoothly. It takes a while to resize the host window (visual delays) and the contents are scrambled during the resize process. But once you’ve sized the window to your desired resolution it feels snappy again.

Downloading and installing service pack 1 + all remaining windows updates took about an hour and a half extra.

Performance

While I haven’t really used the virtual machine extensively – I plan on doing so in the near future – I let Vista measure the performance for me. This resulted in a Windows Experience Index of 1.0! Not really that impressive now is it? Well, it all comes down to the base index of the emulated graphics interface. Breaking down the index into its several measures yields a more satisfying result:

  • Processor (Calculations per second): 4.3
  • Memory (RAM) (Memory operations per second): 4.5
  • Graphics (Desktop performance for Windows Aero): 1.0
  • Gaming Graphics (3D business and gaming graphics performance): 1.0
  • Primary Hard Disk (Disk data transfer rate): 5.9

Again, my macbook is a MacBook3,1 with a 2.2GHz Intel Core 2 Duo and 2GB of RAM (667MHz), so that about explains the 4.3 and 4.5. The emulated graphics are not exactly showcases, but that is a classic issue for many virtualization engines (although I’ve read Parallels has added 3D acceleration support in its latest version). The disk index of 5.9 (which is the maximum possible value at the moment) is really impressive. I guess VirtualBox does a really good job at handling dynamically expanding hard drive images.

Conclusion

I like it! I’m still not convinced that it will keep me away from my physical Vista machine, but I do plan on starting to use this virtual one occasionally. The graphics performance might become an issue if I ever plan on using or even writing any WPF demos or the like, and the memory limit of 1GB will most likely not do me any favors regarding database engine, OLAP or web site performance, but it is a very small step towards mac-only hardware. Although that also implies me buying a bigger mac (iMac, MacBook Pro, Mac Pro?) with a lot more RAM in the distant – or near – future… which is a thought I can stand to live with ;-)

January 15, 2008

Resolving a KPI’s Range into its Status

Filed under: Business Intelligence, SSAS 2005 — benpittoors @ 22:04

The Analysis Services KPI framework is quite powerful. Each KPI consists of 4 major attributes, each resolved through a corresponding MDX expression:

  • Value – the value of the KPI
  • Goal – the goal of the KPI, what you like the value to be/become
  • Status – indicates if the KPI is bad (-1), neutral (0) or good (1)
  • Trend – indicates how the KPI is doing over time

The fact that Status resolves to a value between -1 and 1 serves a pretty nice purpose.  The system and possible client applications can interpret the meaning of the KPI: i.e. is it good or is it bad.

However, it can also complicate things.  Especially when you’re not that good at math, finding a correct formula for the Status expression can prove to be quite challenging.  Let’s say you have a measure [Invoice Aging] that contains the number of days between an Invoice Date and its Closed Date (a late-arriving fact when the invoice is fully paid).  Your CFO wants to know when there are invoices that age longer than 30 days.  He states that if an invoice is paid before the 30th day, then that is a good thing.  On the other hand, if an invoice ages 60 days or longer, then that is bad.  Ideally an invoice is closed the same day that it was created (zero days).

If you translate this into a mapping where value 0 gives status 1 (good), 30 gives 0 (neutral) and 60 gives -1 (bad), you’d get this:

Value  Status      Value  Status      Value  Status
  0     1            20    0.33         40   -0.33
  1     0.97         21    0.3          41   -0.37
  2     0.93         22    0.27         42   -0.4
  3     0.9          23    0.23         43   -0.43
  4     0.87         24    0.2          44   -0.47
  5     0.83         25    0.17         45   -0.5
  6     0.8          26    0.13         46   -0.53
  7     0.77         27    0.1          47   -0.57
  8     0.73         28    0.07         48   -0.6
  9     0.7          29    0.03         49   -0.63
 10     0.67         30    0            50   -0.67
 11     0.63         31   -0.03         51   -0.7
 12     0.6          32   -0.07         52   -0.73
 13     0.57         33   -0.1          53   -0.77
 14     0.53         34   -0.13         54   -0.8
 15     0.5          35   -0.17         55   -0.83
 16     0.47         36   -0.2          56   -0.87
 17     0.43         37   -0.23         57   -0.9
 18     0.4          38   -0.27         58   -0.93
 19     0.37         39   -0.3          59   -0.97
                                        60   -1

Now, needless to say, hard-coding this table into one giant MDX expression is not what you’ll want to do ;-)  Good old math to the rescue!

As you can see in the table above, the status is linear with respect to its value.  It just needs to be normalized from its value range (0…60) into the KPI’s status range (-1…1).  The following expression does just that:

(60 - [Invoice Aging]) * 2 / (60 - 0) - 1

And that’s about it.  But let’s get into the semantics of this formula before ending this post.  The number 60 occurs in it twice.  That number defines the top end of the value range in the above example.  The zero is there for a reason too… yes, in this example it is redundant, but it defines the bottom end of the value range.  And that knowledge comes in handy, because our CFO just changed his mind about the good and the bad of our nifty little KPI (off the record: CFOs have this nasty tendency to do that a lot, as do all CxOs).  He now states that 60 or more days is indeed a bad thing, but if an invoice is paid on the 30th day then that is still a good thing.  So our value range changes from (0…60) to (30…60).  Changing the formula to incorporate this change is pretty easy:

(60 - [Invoice Aging]) * 2 / (60 - 30) - 1

You might also have noticed that this formula resolves the lowest value into the highest status (because low invoice aging is a good thing).  In most cases however, like a sales figure for example, the highest value has to resolve into the highest status.  Say that when you sell x cars or fewer, that is to be considered a bad thing, and if you sell y cars or more, that is good.  Then this would be the formula:

(y - [Cars Sold]) * -2 / (y - x) + 1

As you can see, the 2 has changed into -2. This is because in our previous example it actually defined the target range (of the status value).  The invoice aging had to resolve to ((1) – (-1)), which equals 2.  But now the number of cars sold has to resolve to ((-1) – (1)), which equals -2.
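
Just to make the shape of these two expressions explicit, here is a hypothetical C# transcription (the cube itself obviously uses the MDX expressions above): both are simple linear maps from the value range (low…high) onto the status range (-1…1).

public static class KpiStatus
{
    // Low is good: value == low gives 1, value == high gives -1 (e.g. invoice aging).
    public static double Reversed(double value, double low, double high)
    {
        return (high - value) * 2 / (high - low) - 1;
    }

    // High is good: value == low gives -1, value == high gives 1 (e.g. cars sold).
    public static double Normal(double value, double low, double high)
    {
        return (high - value) * -2 / (high - low) + 1;
    }
}

// KpiStatus.Reversed(30, 30, 60) == 1, KpiStatus.Reversed(45, 30, 60) == 0, KpiStatus.Reversed(60, 30, 60) == -1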

Now, the thing about the subtraction of 1 (reversed status) and the addition of 1 (normal status) is something I cannot really explain.  I know they need to be there in order to get the correct results, but I’m a bit puzzled about their actual meaning.  As I am not a mathematician I can only guess, and so far I haven’t guessed anything worth mentioning yet.  Feel free to comment if you know a decent explanation though ;-)

To conclude this post I’d like to state that things like this can always be calculated by SSAS, and in the end you will be better off following a mathematical approach instead of going the oh-so-messy Iif() way…

December 20, 2007

Why you should base your Facts Data Sources on Views

In SQL Server Analysis Services 2005 you can use different methods to map your source data into your Analysis Services database’s Data Source Views. You can map your data source model one-to-one onto the physical tables in your (dimensional) database, or you can map it onto database views. Needless to say, the latter provides an extra level of abstraction, which is almost always a good thing, imo. You could also use named queries to map your data (the queries reside in your SSAS project), but then you’d lose the abstraction layer again: if the underlying data model changes you’d have to change the SSAS database too, while if you base your data sources on views you’d just have to alter those instead.

Now, if you want to enable partition write-backs (which I think is a very interesting feature for forecasting and what-if analysis scenarios), you can only do that on measure groups containing measures whose AggregateFunction is set to SUM (additive measures). So this can be a problem when your fact table contains both additive and semi-additive measures. As I said, you won’t be able to use write-backs on the semi-additive ones, but if you’ve mapped your facts one-to-one you cannot enable them on the additive ones either, since they will be in the same measure group (nope, you can’t create multiple measure groups on the same fact table in your data source views).  Using database views or named queries you can easily get around this restriction.

But the views still provide a greater level of abstraction than the named queries…

December 19, 2007

Setting a one hour a day limit

Filed under: Propaganda — benpittoors @ 21:01

If you surf the web for a few minutes and type a search query into your favorite search engine, you’ll probably notice that millions of people enjoy their daily blog posting activities.  And I must admit, the thought of showing off my knowledge to the world through the same means – a blog – has crossed my mind more than once.  Well, the inevitable has happened: this is my first blog post!

The title of my first post indicates that I’m setting a limit however: I absolutely will not spend more than one hour a day on this blog. Most days I won’t put any time in at all.

As a Software Architect I have to deal with a whole bunch of technologies every day.  So that’s what I’ll be posting about most, I guess.  So… what can you expect here (assuming you’d expect something, that is)?

  • Business Intelligence related stuff.  This can go from dimensional modeling topics to SQL Server Analysis Services 2005 (2008?) cube design.  I’m still a novice in that field but I intend to become a true data warehouse guru ;)
  • C# and the .NET Framework… and even Java if I happen to be in that camp at the moment of posting.
  • Agile development methodologies (Genesis)
  • I love my MacBook running Leopard so… who knows :)

And that’s not an all-inclusive list!

