Tuesday, April 14, 2015

DSLs With Ruby Treetop

I have been playing with Ruby quite a bit lately.  One of the things I have been working on is a simple data query language to interface with our back-end databases.  At my job this must work on an old UniData database (running on an HP-UX 10.2 box) and SQL Server.  I wanted a language that looks much like the following:

Data Query Language Syntax Help

General Commands:
Command History:             list [NumberOfCommandsToReturn] commands
Exit Program:                exit
Reparse The Grammar:         reparse grammar

Macro Commands:
DQL Macros Are A Series Of Commands That You Want Executed
Begin Macro:                 begin macro
Clear All Macros:            clear all macros
Delete All Macros:           delete all macros
Delete Macro:                delete macro ["MacroName"] 
Edit Macro Save File:        edit macros [from FileName] in Notepad
End Macro:                   end macro [into "MacroName"]
List Current Macros:         list macros
List Macro Commands:         list macro "MacroName" commands
List Macro Files:            list macro files
Load Macro File:             load macros [from FileName] 
Run The Macro:               run macro ["MacroName"]
Save Macros To File:         save macros [into FileName]

Database Object Queries:
List Table Names:            list [avante | dw | datawarehouse] tables [into ResultSetName]
List Table Columns:          list [avante | dw | datawarehouse] table "ITMMST" columns [into ResultSetName]

Database Queries:
Run Query:                   run [avante | dw | datawarehouse] query "LIST ITMMST PART.NBR DESCRIPTION" [INTO ResultSetName]

Result Set Commands:
Copy Results:                copy result set SourceResultSetName to DestinationResultSetName
Delete Results:              delete ResultSetName
Filter Results:              filter ResultSetName by "FilterString1" [and "FilterString2" and ...]
List Result Sets:            list result sets
Match Results:               match ResultSetName on "MatchString1" [and "MatchString2" and ...]
Output To Excel:             output ResultSetName to excel file "ExcelFileName"
Revert Results:              revert ResultSetName
Show Results:                show [NumberOfLinesToReturn] [ResultSetName]

I wanted the ability to run queries on both of our main enterprise databases and do simple filtering/matching on the results (called Result Sets in this system).  I also wanted the ability to create a series of commands and save them in a "macro" that I could run later.  I wanted to be able to save these macros to a file to be run later as well.  In this way I could build up libraries of commonly used commands.  Finally, I wanted to be able to call up lists of database objects I am always searching for (table names and columns).
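To make that concrete, here is the sort of session I had in mind (a hypothetical transcript; the DQL> prompt and the Parts result set name are just illustrations):

DQL> run dw query "LIST ITMMST PART.NBR DESCRIPTION" INTO Parts
DQL> match Parts on "WIDGET"
DQL> filter Parts by "OBSOLETE"
DQL> output Parts to excel file "WidgetParts"
DQL> show 20 Parts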

I knew that the "easiest" way to achieve this would be to create a DSL that encapsulated all these commands.  You could do it by hand with a pile of regexes, but that would be no fun.  Luckily Ruby has many tools for building DSLs.  I chose Treetop to create my new language.  Treetop is a great tool; the only real negative to using it is the near-complete lack of good documentation.  Hopefully this post will help someone avoid bashing their head into a wall like I did.

In Treetop you build your language using strings and regular expressions.  Let's take an example from the language described above:

List Table Names:            list [avante| dw | datawarehouse] tables [into ResultSetName]

I represented this in a rule that looks like this:

rule db_command_list_tables
  'list' space database_type space database_object space? optional_resultset:into_resultset?
  {
    def evaluation_code
      eval_code = ""
      my_result_set_name = 'default'

      # Write Code To Get The Table Names
      case database_type.text_value
      when 'avante'
        if database_object.text_value == 'tables'
          eval_code = "result_set = avante_db.get_table_names"
        end
      when 'dw','datawarehouse'
        if database_object.text_value == 'tables'
          eval_code = "result_set = datawarehouse_db.get_table_names"
        end
      end

      # Write Code To Handle Optional Result Set
      if defined?(optional_resultset.result_set_name)
        my_result_set_name = optional_resultset.result_set_name.text_value
      end

      # Add ResultSet To Result Set Dictionary
      eval_code = eval_code + "; result_set_dict.add_existing_result_set(result_set,'#{my_result_set_name}')"

      # Probably Want To List Result Set By Default
      if my_result_set_name == 'default'
        eval_code = eval_code + "; puts result_set.value; puts \"\\n\" + Rainbow(\"Results Stored In Result Set: default\").green"
      end

      return eval_code
    end
  }
end

rule database_type
  # Longest alternative first: Treetop choices are PEG ordered choices, so
  # 'dw' would otherwise match the prefix of 'datawarehouse' and the parse
  # would never come back to try the longer alternative
  'avante' / 'datawarehouse' / 'dw'
end

rule database_object
  'table' [s]* / 'query' / 'data' / 'info'
end

rule into_resultset
  'into' space result_set_name
end

rule space
  [\s]+
end

The db_command_list_tables rule builds on the database_type, database_object, into_resultset and space rules to create a complete command.  One important thing to note is that when you make something optional (by appending a question mark to it), it can no longer be referenced directly by its rule name.  An example of this is the into_resultset? part of the db_command_list_tables rule.  We added a question mark to the end of it.  Notice that I prepended "optional_resultset:" to into_resultset?  That label enables me to later use this code to look up its value:

# Write Code To Handle Optional Result Set
if defined?(optional_resultset.result_set_name)
  my_result_set_name = optional_resultset.result_set_name.text_value
end 

It would make sense if you were able to use a construct like into_resultset.result_set_name.text_value to reference the result set name.  This will fail.  Instead you must prepend a label such as optional_resultset: to into_resultset to look inside the into_resultset rule.  Why?  I have no idea, and it took me forever to figure this out.  I reiterate: it is completely necessary to prepend a label to any referenced rule you make optional (with a question mark).  This is nowhere (that I could find) in the "official" Treetop documentation and it will trip you up.
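To see the gotcha in isolation, here is a minimal, self-contained sketch.  The TinyDQL grammar and all names in it are hypothetical (not part of the real DQL grammar), and it assumes the treetop gem, whose Treetop.load_from_string compiles a grammar given as a string:

require 'treetop'

Treetop.load_from_string <<'GRAMMAR'
grammar TinyDQL
  rule command
    'list tables' space? maybe_into:into_resultset?
    {
      def result_set_name
        # The empty node left by an unmatched optional has no
        # result_set_name accessor, so guard the call
        if maybe_into.respond_to?(:result_set_name)
          maybe_into.result_set_name.text_value
        else
          'default'
        end
      end
    }
  end

  rule into_resultset
    'into' space result_set_name
  end

  rule result_set_name
    [a-zA-Z0-9_]+
  end

  rule space
    [\s]+
  end
end
GRAMMAR

parser = TinyDQLParser.new
puts parser.parse('list tables').result_set_name                 # => default
puts parser.parse('list tables into MyTables').result_set_name   # => MyTables

Without the maybe_into: label, the command rule would have no way to reach into the optional part at all.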

The final item of interest is the:

{
  def evaluation_code
    ....
  end
}

construct inside the rule.  This enables you to embed Ruby code that processes the text from rules that match your input.  How meta ;-).  In my case, each rule's evaluation_code method builds up the Ruby code that will be evaluated when that rule matches.  To use these rules you have to load the Treetop grammar file and compile it into a parser.  I used a parser class to accomplish this:

class DQL_Parser
  
  def initialize(grammar_path)
    @path_to_grammar = grammar_path
    
    Treetop.load(grammar_path)
    @parser = DQLParser.new 
  end
  
  def reload_grammar()
    puts Rainbow("Reparsing Grammar").green
    Treetop.load(@path_to_grammar)
    @parser = DQLParser.new
  end
  
  def parse(command)
    tree = nil
    if command !~ /^\s*$/
      # Pass the data over to the parser instance
      tree = @parser.parse(command)
      # If the AST is nil then there was an error during parsing
      # we need to report a simple error message to help the user
      if(tree.nil?)
        puts Rainbow("ERROR: Cannot parse \"#{command}\".  Error is at offset: #{@parser.index}").red
        # The 21-character message prefix ('ERROR: Cannot parse "') shifts the
        # caret so it lines up under the character that failed to parse
        puts Rainbow((' ' * (@parser.index + 21)) + '^').red
        puts Rainbow((' ' * (@parser.index + 21)) + '|').red
      end
    end
    return tree
  end

end

$parser = DQL_Parser.new($base_path + '\Data_Query_Language.treetop')
.... some code to set up a REPL
parse_tree = $parser.parse(command)
if !parse_tree.nil?
  if parse_tree.evaluation_code !~ /^ERROR\:/i
    if parse_tree.evaluation_code != ''
      eval(parse_tree.evaluation_code)
  ....

These commands create a parser using all of the rules contained in the Data_Query_Language.treetop file.  Then I parse each command a user enters with $parser.parse.  If the parse tree is not nil and its evaluation_code returns something other than an error or an empty string, we have a winner.  We take the evaluation code returned from the parser and run it with Ruby's eval method.  If we put this code in a loop then we have a REPL that we can use to run a series of DQL (Data Query Language) commands.

The key thing to remember is that the evaluation code runs in the context of the Ruby program that created the parser instance.  By using the Ruby readline and rainbow gems you can add convenience features such as command auto-completion and colored error/information output.
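Here is a rough sketch of what that REPL loop can look like (the prompt and completion list are hypothetical, and it assumes the $parser instance created above):

require 'readline'
require 'rainbow'

# Hypothetical completion list; the real one would mirror the grammar's commands
COMMANDS = ['list tables', 'list macros', 'list result sets', 'exit'].freeze
Readline.completion_proc = proc do |prefix|
  COMMANDS.grep(/\A#{Regexp.escape(prefix)}/)
end

# readline's second argument adds each command to the history buffer
while (command = Readline.readline('DQL> ', true))
  break if command =~ /\Aexit\s*\z/i
  parse_tree = $parser.parse(command)
  next if parse_tree.nil?
  code = parse_tree.evaluation_code
  # Run the generated code in the context of this program
  eval(code) if code != '' && code !~ /\AERROR:/i
end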

The only things left for me to do were to create the classes that would let me interface with my databases and easily manipulate the result sets they returned.  This was accomplished using DBI, net/telnet, net/ftp and good old-fashioned Ruby classes.

My database classes had the following structure:

class DBInfo
  def initialize

  end
  
  def log_in
    
  end 
  
  def run_query
    
  end
  
  def get_table_names
    
  end
  
  def get_table_cols
    
  end
  
  def close
    
  end  
end

My SQL Server and UniData classes inherit from DBInfo.  Your classes would depend on the type of databases you were interfacing with.
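For example, a SQL Server subclass might look roughly like this.  This is only a sketch: the ODBC DSN, credentials and catalog query are hypothetical stand-ins, and it assumes the dbi gem with a working ODBC driver:

require 'dbi'
require 'tempfile'

class DataWarehouseDB < DBInfo
  def log_in
    # Connect through a (hypothetical) ODBC data source name
    @dbh = DBI.connect('DBI:ODBC:DataWarehouseDSN', 'dql_user', 'secret')
  end

  def run_query(sql, column_chars = '|')
    # Spool one record per line into a temp file, the format ResultSet expects
    result_file = Tempfile.new('OpenODBCECL_Orig')
    sth = @dbh.execute(sql)
    sth.fetch { |row| result_file.puts(row.to_a.join(column_chars)) }
    sth.finish
    result_file.flush
    result_file
  end

  def get_table_names
    # SQL Server keeps user table names in the sys.tables catalog view
    run_query('SELECT name FROM sys.tables ORDER BY name')
  end

  def close
    @dbh.disconnect if @dbh
  end
end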

The more interesting classes are the result set classes (excuse the mess, I haven't had time to properly refactor these):

class ResultSet
  attr_reader :query, :query_type, :column_chars, :name
  
  def initialize(orig_result_temp_file, query, query_type, column_chars = '|', name = 'default')
    @orig_result_file = orig_result_temp_file
    @curr_result_file = @orig_result_file    
    @query = query
    @query_type = query_type
    @column_chars = column_chars
    @name = name
  end
  
  def value
    # Read Results From (Possibly Filtered) Temporary Result Files
    if File.exist?(@curr_result_file.path)
      return File.read(@curr_result_file.path)
    else
      puts Rainbow("ERROR: Result File \"#{@curr_result_file.path}\" Does Not Exist").red
      return ""
    end 
  end
  
  def original_value
    # Return Original Results (Before Filters And Matches)
    if File.exist?(@orig_result_file.path)
      return File.read(@orig_result_file.path)
    else
      puts Rainbow("ERROR: Result File \"#{@orig_result_file.path}\" Does Not Exist").red
      return ""
    end     
  end
  
  def print_line_in_columns(lines_to_display = 0)
    line_counter = 1
    
    if File.exist?(@curr_result_file.path)
      # Even At One Byte Per Line It Will Never Have More Than This Many Lines
      if lines_to_display <= 0 then lines_to_display = File.size(@curr_result_file.path) end
      
      File.readlines(@curr_result_file.path).each do |line|
        break if line_counter > lines_to_display
        puts line.gsub(/#{@column_chars}/n,"\t")
        line_counter = line_counter + 1
      end
    else
      puts Rainbow("Results File \"#{@curr_result_file.path}\" Does Not Exist").red
      exit
    end
  end
  
  def write_to_excel(filename)
    # Write This Result Set To Excel (xlsx)
    excel_filename = "#{$base_path}/DQL/Excel/#{filename}.xlsx"
    p = Axlsx::Package.new
    p.use_autowidth = true
    wb = p.workbook
    wb.add_worksheet(:name => "Results") do |sheet|
      File.open(@curr_result_file.path,"r").each_line do |line|
        sheet.add_row(line.split(/#{@column_chars}/n))
      end
    end
    p.serialize(excel_filename)
  end
  
  def revert_to_original_value
    # Revert Result Set To Original Value (Before Filters And Matches)
    # Note: !@orig_result_file == @curr_result_file is always false in Ruby,
    # so the comparison must be written with !=
    if @orig_result_file != @curr_result_file
      @curr_result_file.close
      @curr_result_file.unlink
    end
    @curr_result_file = @orig_result_file
    puts Rainbow("Result Set \"" + name + "\" Reverted To Original Data").green
    self
  end
  
  def filter_out(filter_query = "")
    # Filter Things Out Of The Result Set
    if @orig_result_file == @curr_result_file
      if not(File.exist?(@orig_result_file.path))
        puts Rainbow("ERROR: Result File \"#{@orig_result_file.path}\"Does Not Exist").red
        exit        
      end
      source = @orig_result_file
      destination = Tempfile.new('OpenODBCECL_Dest')
    else
      if not(File.exist?(@curr_result_file.path))
        puts Rainbow("ERROR: Result File \"#{@curr_result_file.path}\"Does Not Exist").red
        exit        
      end      
      source = @curr_result_file
      destination = Tempfile.new('OpenODBCECL_Dest')
    end
    # Create New Result File Based On The Filter Query
    output = File.open( destination.path,"a" )    
    File.readlines(source.path).each do |line|
      if line !~ /#{filter_query}/i
        output << line
      end   
    end      
    output.close
    if @orig_result_file != @curr_result_file
      # Remove Intermediate Result File If It Is Not Equal To Original Results
      @curr_result_file.close
      @curr_result_file.unlink
    end
    # Set Current Result File To New Destination
    @curr_result_file = destination
    # Give Status
    puts Rainbow("Filter \"#{filter_query}\" Applied To Result Set \"" + name + "\"").green
    # Return self So You Can Chain Filters
    self      
  end
  
  def matching(matching_query = "")
    # Return Items Matching matching_query
    if @orig_result_file == @curr_result_file
      if not(File.exist?(@orig_result_file.path))
        puts Rainbow("ERROR: Result File \"#{@orig_result_file.path}\"Does Not Exist").red
        exit        
      end
      source = @orig_result_file
      destination = Tempfile.new('OpenODBCECL_Dest')
    else
      if not(File.exist?(@curr_result_file.path))
        puts Rainbow("ERROR: Result File \"#{@curr_result_file.path}\"Does Not Exist").red
        exit        
      end      
      source = @curr_result_file
      destination = Tempfile.new('OpenODBCECL_Dest')
    end
    # Create New Result File Based On The Matching Query
    output = File.open( destination.path,"a" )    
    File.readlines(source.path).each do |line|
      if line =~ /#{matching_query}/i
        output << line
      end   
    end      
    output.close
    if @orig_result_file != @curr_result_file
      # Remove Intermediate Result File If It Is Not Equal To Original Results
      @curr_result_file.close
      @curr_result_file.unlink
    end
    # Set Current Result File To New Destination
    @curr_result_file = destination
    # Give Status
    puts Rainbow("Match Query \"#{matching_query}\" Applied To Result Set \"" + name + "\"").green
    # Return self So You Can Chain Filters
    self
  end
    
  def clean_up_result_files
    # Clean Up Result Files
    if !@orig_result_file.nil?
      if File.exist?(@orig_result_file.path)
        @orig_result_file.close
        @orig_result_file.unlink
      end
    end
    # @curr_result_file.path is nil if the file was already unlinked above
    # (the case where @orig_result_file == @curr_result_file)
    if !@curr_result_file.nil? and !@curr_result_file.path.nil?
      if File.exist?(@curr_result_file.path)
        @curr_result_file.close
        @curr_result_file.unlink
      end 
    end
  end
  
  def close
    clean_up_result_files
  end  
end

class ResultSetDictionary
  
  def initialize
    @result_set_hash = Hash.new
  end
  
  def result_set_exists?(key)
     if @result_set_hash[key].nil?
       return false
     else
       return true 
     end
  end
  
  def list_result_sets()
    puts "Existing Result Sets:"
    @result_set_hash.keys.each {|key| puts "    #{key}"}
  end
  
  def get_result_set(result_set_name)
    #puts @result_set_hash[result_set_name].value
    if result_set_exists?(result_set_name)
      return @result_set_hash[result_set_name]
    else
      puts "Result Set \"#{result_set_name}\" Does Not Exist"
    end 
  end
  
  def delete_result_set(result_set_name)
    if result_set_exists?(result_set_name)
      get_result_set(result_set_name).close
      @result_set_hash.delete(result_set_name)
      puts Rainbow("Result Set \"#{result_set_name}\" Is Deleted.").green
    else
      puts Rainbow("ERROR: Result Set \"#{result_set_name}\" Does Not Exist.").red
      return
    end
  end
  
  def add_existing_result_set(existing_result_set, result_set_name = 'default')
    # Given A ResultSet Add It To My Dictionary
    if result_set_exists?(result_set_name)
      puts Rainbow("ERROR: Result Set #{result_set_name} For Database Type #{existing_result_set.query_type} All Ready Exists.  Deleting Existing Key").red
      # return nil
      delete_result_set(result_set_name)
    end
    @result_set_hash.store(result_set_name,existing_result_set)
    return @result_set_hash[result_set_name]
  end
  
  def add_new_result_set(orig_result_file, query = '', result_set_name = 'default', database_type = '', column_chars = '|')
    # Given Proper Items Add A New Result Set To The Result Set Dictionary
    if result_set_exists?(result_set_name)
      puts Rainbow("ERROR: Result Set #{result_set_name} For Database Type #{database_type} Already Exists.  Deleting Existing Key").color(255,102,0)
      delete_result_set(result_set_name)
    end
    # Arguments must line up with ResultSet#initialize(orig_result_temp_file, query, query_type, column_chars, name)
    @result_set_hash.store(result_set_name, ResultSet.new(orig_result_file, query, database_type, column_chars, result_set_name))
    return @result_set_hash[result_set_name]
  end
  
  def cleanup
    puts "Cleaning Up Result Sets"
    # Close All Result Sets
    @result_set_hash.keys.each {|key| puts "    Result Set \"#{key}\" Cleaned"; @result_set_hash[key].close}
    puts "Result Sets Cleaned"
  end
end

These two classes let me reference result sets by name in my evaluation_code Treetop definitions.  A result set is just a temporary text file holding query results (one record per line).  Filtering and matching are done by reading these files one line at a time and writing the appropriate lines into new temporary result set files.  The temporary files are cleaned up on program exit or when a result set is deleted.
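As a usage example, the generated evaluation code can manipulate a result set like this (hypothetical names; it assumes a result set called 'parts' already exists in result_set_dict):

parts = result_set_dict.get_result_set('parts')
# matching and filter_out return self, so they can be chained
parts.matching('WIDGET').filter_out('OBSOLETE')
parts.print_line_in_columns(20)        # show the first 20 lines
parts.write_to_excel('widget_parts')   # writes DQL/Excel/widget_parts.xlsx
parts.revert_to_original_value         # throw away the filters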

This is the process I used to create my DQL language.  I was able to create the whole language using 634 lines of Treetop definitions and a 734-line program that contains all the database, result set and REPL logic.

Overall I am impressed with Treetop's capabilities.  More documentation would be nice but answers can be found on the Internet (with a lot of effort).  I'm hoping this will become another resource for people curious about this outstanding library.

Friday, March 06, 2015

I Always Think Of The Best Things To Say After I Leave

Last Friday I met with the folks from Pomiet Software here in Miamisburg, OH.  I was not looking to interview; this opportunity was dropped into my Gmail account by a recruiter I had never talked with before.  The Pomiet group is a great group of people doing wonderful things that would have been fun to be a part of.

Part of the interview process was a coding challenge a few days before the interview.  Here is the challenge:

Hi and welcome to the StoreFront. As you know, we are a small store with a prime location in a prominent city ran by a friendly store manager named Sarah. We also buy and sell only the finest elements. Unfortunately, our items are constantly losing shelf value as they approach their sell by date. We have a system in place that updates our inventory for us. It was developed by a no-nonsense guy named Larry, who has moved on to new adventures. Your task is to add the new feature to our system so that we can begin selling a new category of items. First an introduction to our system:

        - All items have a ShelfLife value which denotes the number of days we have to sell the item
        - All items have a Worth value which denotes how valuable the item is
        - At the end of each day our system lowers both values for every item

Pretty simple, right? Well this is where it gets interesting:

        - Once the shelf life date has passed, Worth degrades twice as fast
        - The Worth of an item is never negative
        - "Gold" actually increases in Worth the older it gets
        - The Worth of an item is never more than 50
        - "Cadmium" is rare, has a worth of 80, and will never decrease in Worth
        - "Helium", like gold, increases in Worth as its ShelfLife value changes; Worth increases by 2 when there are 10 days or less and by 3 when there are 5 days or less but Worth drops to 0 once the ShelfLife is passed.

We have recently signed an alchemist to create "Alchemy" items. This requires an update to our system:

        - "Alchemy" items degrade in Worth twice as fast as normal items
        - "Alchemy" items have a maximum worth of 100
Feel free to make any changes to the UpdateWorth method and add any new code as long as everything still works correctly. However, do not alter the Item class or Items property as those belong to another team that doesn't believe in shared code ownership (you can make the UpdateWorth method and Items property static if you like, we'll cover for you).  If you happen to find any conflicts with the above requirements, we would appreciate if you fixed them.

Just for clarification, an item can never have its Worth increase above 50, however "Cadmium" is a rare item and as such its Worth is 80 and it never alters.

PLEASE RETURN ALL OF THE FILES THAT YOU CREATE. FEEL FREE TO ZIP THE ENTIRE SOLUTION AND RETURN IT TO US.

    private void UpdateWorth()
    {
        for (var i = 0; i < Items.Count; i++)
        {
            if (Items[i].Name != "Gold" && Items[i].Name != "Helium")
            {
                if (Items[i].Worth > 0)
                {
                    if (Items[i].Name != "Cadmium")
                    {
                        Items[i].Worth = Items[i].Worth - 1;
                    }
                }
            }
            else
            {
                if (Items[i].Worth < 50)
                {
                    Items[i].Worth = Items[i].Worth + 1;

                    if (Items[i].Name == "Helium")
                    {
                        if (Items[i].ShelfLife < 11)
                        {
                            if (Items[i].Worth < 50)
                            {
                                Items[i].Worth = Items[i].Worth + 1;
                            }
                        }

                        if (Items[i].ShelfLife < 6)
                        {
                            if (Items[i].Worth < 50)
                            {
                                Items[i].Worth = Items[i].Worth + 1;
                            }
                        }
                    }
                }
            }

            if (Items[i].Name != "Cadmium")
            {
                Items[i].ShelfLife = Items[i].ShelfLife - 1;
            }

            if (Items[i].ShelfLife < 0)
            {
                if (Items[i].Name != "Gold")
                {
                    if (Items[i].Name != "Helium")
                    {
                        if (Items[i].Worth > 0)
                        {
                            if (Items[i].Name != "Cadmium")
                            {
                                Items[i].Worth = Items[i].Worth - 1;
                            }
                        }
                    }
                    else
                    {
                        Items[i].Worth = Items[i].Worth - Items[i].Worth;
                    }
                }
                else
                {
                    if (Items[i].Worth < 50)
                    {
                        Items[i].Worth = Items[i].Worth + 1;
                    }
                }
            }
        }
    }

    private IList<Item> Items = new List<Item>
                                    {
                                        new Item {Name = "Aluminum Shackles", ShelfLife = 10, Worth = 20},
                                        new Item {Name = "Gold", ShelfLife = 2, Worth = 50},
                                        new Item {Name = "Plutonium Pinball Parts", ShelfLife = 5, Worth = 7},
                                        new Item {Name = "Cadmium", ShelfLife = 0, Worth = 80},
                                        new Item {Name = "Helium", ShelfLife = 15, Worth = 38},
                                        new Item {Name = "Alchemy Iron", ShelfLife = 3, Worth = 75}
                                    };

class Item
{
    public string Name { get; set; }

    public int ShelfLife { get; set; }

    public int Worth { get; set; }
}

Spend as much time as you feel comfortable spending.  Typically you can expect to commit about 1 hour towards your solution.  I look forward to seeing the completed project.


It was in C#, which I had not coded in for a few years.  That is no excuse for how badly I botched the job.  It had been a while since I had delved into design patterns and object-oriented programming, so I went with a naive procedural approach.  It was much better than the original (that wasn't hard) but it did not use object-oriented design, design patterns or anything approaching my best work.

Even worse, when they asked me how I could expand my code to handle hundreds of items I choked in a big way (easy to do after a 2.5 hour interview, I suppose).  I left the interview knowing that I had failed to get the job.  Much worse than that was the feeling that I did not really show them what I could do.

Sure enough, the recruiter dropped me an email on Sunday saying they thought I was a fit culturally but that I did not have the technical skills they were looking for in a team lead.  I was kind of devastated, but that's life.  I knew I had not shown them how quickly I could learn the things they were looking for.  I had done a B- job at best in that interview.

I took away a few things from this process:

  • Refresh my object-oriented design knowledge
  • Learn more about agile software design
  • Learn Ruby - I have looked at it a few times in the past but one of the guys in the interview seemed really sold on it

Looking at that list I decided to learn Ruby and use it to refresh my object-oriented design skills.  Monday, I started looking at it and I am really amazed by how well Ruby gets out of your way.  It is really great.

Anyway, here is a non-instrumented solution to the above coding challenge in Ruby.  I did not put in the "business rules" for every item in the challenge, just enough to prove to myself that this would have been an acceptable solution:


class Inventory
  def initialize(*items)
    @inventory = []
    add_inventory_items(*items)
  end  
  
  def add_inventory_items(*items)
    items.each do |item|
      puts "Adding #{item.name} Inventory"
      @inventory.push(item)
    end 
  end
  
  def remove_inventory_items(*items)
    items.each do |item|
      puts "Removing #{item.name} Inventory"
      # Array has no #remove method; #delete removes the matching item
      @inventory.delete(item)
    end 
  end
  
  def return_inventory_items
    @inventory
  end
  
  def to_s
    @inventory.each {|item| puts item}
    # Return empty string so that you do not get Inventory object printed at end of input
    '' 
  end
end

class Item

  attr_reader :name
  attr_accessor :shelflife, :worth
  
  def initialize(name, shelflife, worth)
    @name = name
    @shelflife = shelflife
    @worth = worth
  end
  
  def to_s
    "#{name} => Value Of #{worth} With Shelf Life Of #{shelflife}"
  end
end

class BusinessRule

  # block is a lambda (or proc) that takes the item as its first argument
  def initialize(block)
    @br_block = block
  end
  
  def apply_rule(object,*other_args)
    # Forward any extra arguments along to the rule block
    @br_block.call(object,*other_args)
  end
    
end

class BusinessRules

  def initialize
    # Can be multiple Rule Types Per Object
    @rules = Hash.new{|hash, key| hash[key] = Array.new}
  end
  
  def add_rule(object,block)
    @rules[object.name].push(BusinessRule.new(block))
  end
  
  def apply_all_matching_rules(object,*other_args)
    @rules[object.name].each do |br|
      br.apply_rule(object,*other_args)
    end
  end
  
  def has_business_rules_for?(object)
    @rules.key?(object.name)
  end 
end

def subtract_one_day_from_shelflife(item)

  item.shelflife -= 1  
  if item.shelflife < 0 then
    item.shelflife = 0
  end
end

puts "Addng Initial Inventory Values"

gold = Item.new("Gold",15,50)
adamantium = Item.new("Adamantium",100,1000)
alchemy_sulfur = Item.new("Alchemy Sulfur",30,40)
store_inventory = Inventory.new(gold,adamantium,alchemy_sulfur)
puts "\nInventory Values Before Applying Business Rules:"
puts store_inventory

puts "Creating Business Rules For Gold"

daily_process_rules = BusinessRules.new()
daily_process_rules.add_rule(gold, lambda {|item| subtract_one_day_from_shelflife(item)})
daily_process_rules.add_rule(gold, lambda {|item| item.worth += 1})
daily_process_rules.add_rule(gold, lambda {|item| if item.worth < 50 then item.worth = 50 end})

puts "Adding Specialized Business Rules For Items With No Pre-Defined Rules"

# Add Generic Business Rules To Items That Have None 
store_inventory.return_inventory_items.each do |item|
  if item.name.downcase.include?("alchemy ") then
    puts "    Creating Business Rules For Alchemy Object \"#{item.name}\""
    daily_process_rules.add_rule(item, lambda {|item| subtract_one_day_from_shelflife(item)})
    daily_process_rules.add_rule(item, lambda {|item| if item.shelflife == 0 then item.worth -= 2 else item.worth -= 1 end})
    daily_process_rules.add_rule(item, lambda {|item| if item.worth < 0 then item.worth = 0 end})    
  end
  # Add Generic Business Rules To Item Because None Have Been Given
  if !(daily_process_rules.has_business_rules_for?(item)) then
    puts "    Adding Generic Business Rules to Item #{item.name}"
    daily_process_rules.add_rule(item, lambda {|item| subtract_one_day_from_shelflife(item)})
    daily_process_rules.add_rule(item, lambda {|item| item.worth -= 1})
    daily_process_rules.add_rule(item, lambda {|item| if item.worth < 0 then item.worth = 0 end})
  end
end

puts "\nApplying Daily Process Business Rules For All Inventory Items"

store_inventory.return_inventory_items.each do |item|
  puts "    Applying Daily Process Business Rules For #{item.name}"
  daily_process_rules.apply_all_matching_rules(item)
end

puts "\nInventory Values After Applying Business Rules:"

puts store_inventory

puts "Strings Could Be Placed In DataBase And Loaded As Business Rules Using Eval And Lambda"

it_works = eval "lambda {puts \"IT WORKS!!!\"}"
it_works.call
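
To sketch that idea a little further (hypothetical; rule_source stands in for a string fetched from a database table):

# The rule body lives as a string, exactly as it might be stored in a database
rule_source = 'lambda {|item| item.worth += 5 if item.shelflife > 10}'
# eval turns the string back into a callable lambda for a BusinessRule
daily_process_rules.add_rule(adamantium, eval(rule_source))
# Note: this reapplies all of adamantium's rules, including the generic ones added above
daily_process_rules.apply_all_matching_rules(adamantium)
puts adamantium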

I think if I had presented a design like this I probably would have gotten the job.  I am fairly happy at the job I have, so I guess this is not the end of the world ;-).

Wednesday, October 01, 2008

Cloud Computing Is A Trap

Richard Stallman says that cloud computing is a trap and a hyped fad. It is only there to make you pay more over time and trap you in a proprietary format. In other news, the Internet is a fad that should be fading sometime soon.

Friday, September 19, 2008

Still Out Of Power

I've been out of power since Sunday at 4:30PM. I thought I would be going crazy by now but strangely I am enjoying being disconnected from all the technology I have in my life. Don't get me wrong, I will be happy to have television again. It just isn't as critical as I thought it would be. Instead of zoning in front of the TV, I have begun reading in earnest again. I've read 3 books this week:

The Man With The Golden Torc
Street Of Shadows (Star Wars Coruscant Nights Book 2)
Against the Tide Of Years

It's amazing how much reading you can get done when you don't have the television distracting you.

Monday, July 21, 2008

Getting Back Into "Art" Again

I had some free time this last weekend and whipped this up. Maybe it will go good on a t-shirt ... maybe not ;-).

Wednesday, July 09, 2008

Baggage Mishandling

Hey wait. If the airlines are going to charge me $15 for each checked bag, shouldn't they be doing a better job? 1 in 138 bags is lost forever. This statistic does not include luggage that gets to you late. If the airlines want to charge money for something they used to provide for free, shouldn't they at least have to provide value? Just a thought.

Real Grief In A Virtual World

I just found out that a longtime Second Life friend (Vx Shaw) passed away last month. I had not been in SL since last year. I had gotten really burnt out after 2.5 years of playing and needed a break. Weeks turned into months. I kept meaning to go back and check in, but my home computer died and I only just got it fixed. Now I'll never have a chance to talk to Vx (aka Bazoo Benton) again.

The things I remember best about Vx are her upbeat attitude and the endless support that she provided her friends. She let me camp my store out on Abydos for a year for free. She also always supported my many and sometimes wacky endeavors in-game. She will truly be missed.

Tuesday, July 08, 2008

Telecom Companies Need To Buy A Clue

I was just reading on law.com that telecoms are suing municipalities for rolling out free wifi to their constituents. These same telecoms often cannot be bothered to roll out any internet service whatsoever to those municipalities. This is what happens when you base your business plan on artificial scarcity: you end up having to sue your own customers (just ask the music and movie industries). Once your services are ubiquitous commodities, you have to either find another business or employ economies of scale to make more money. Suing your customers is not a viable alternative ... ever. If you do, do not be surprised when your customers walk all over you on their way to a better alternative.