Go check out this website called Idiagram: The Art of Insight and Action, specifically this page, which not only shows a great diagram of complex systems thinking but also an impressive way to blend JavaScript with graphics to present a freeform, user-driven diagram.
I found this great website via the Anecdote blog entry
Have Fun
Paul
Monday, August 14, 2006
Thursday, August 03, 2006
Design or Evolve?
Here are a couple of interesting articles from Steve Jurvetson about weighing up the benefits of designing your algorithms versus letting them evolve. Read the comments as well, as they are pretty good.
More work is required in using Bayesian reasoning on evolved algorithms, essentially using the power of Bayesian inference to determine the set of information which you don't know and then continuing to whittle that set down.
Directed evolution would potentially help as well, dependent on understanding to the point of exhaustion what your assumptions are in setting the direction or guidelines or rules (framework).
Have Fun
Paul
Howto: copy a production MySQL database using a mysqlhotcopy backup
Do you need to see what the data in your tables looked like yesterday, or last week?
Here are the steps to create a copy of a mysql database on the same machine, with no downtime.
Assumption: You have made backups using mysqlhotcopy and are using myisam tables.
Caveat: Make sure you copy the backup into the new directory, otherwise you will nuke the current database. You have been warned.
- Create a new directory in your mysql data directory (/var/lib/mysql): mkdir new_db
- Copy the backup you want into that new directory: cp /backup_dir/current_db/* new_db/
- Log into mysql and create the new database: CREATE DATABASE IF NOT EXISTS new_db;
- If any tables have been created since the backup you will have to create them in new_db, using this command: CREATE TABLE IF NOT EXISTS new_db.table_name LIKE current_db.table_name;
- Check/Validate all the tables in new_db are ok using command: CHECK TABLE new_db.table_name;
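Putting the steps above together, here is a minimal sketch; the database names and backup path are placeholders, and the shell steps are shown as comments next to the SQL.
-- Shell steps (as a user that can write to the MySQL data directory):
--   mkdir /var/lib/mysql/new_db
--   cp /backup_dir/current_db/* /var/lib/mysql/new_db/
-- Then in the mysql client:
CREATE DATABASE IF NOT EXISTS new_db;
-- Only needed for tables added after the backup was taken:
CREATE TABLE IF NOT EXISTS new_db.new_table LIKE current_db.new_table;
-- Validate each copied MyISAM table:
CHECK TABLE new_db.some_table;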
The reason I created this post was that I spent at least 30 minutes searching Google for a solution to this, and whilst there were plenty of articles about backups, nothing mentioned making a copy locally, on the same system.
Have Fun
Paul
Wednesday, August 02, 2006
How inflation happens or What inflation is
So what is inflation, how does inflation happen?
It is not increasing prices, that is the effect of the cause.
The cause is an increase in the amount of money in the system.
Think of prices as ratios, e.g. a kilo bag of potatoes costs a certain number of dollars.
Prices are signals, signals of demand and supply.
Recently Australian bananas have been really expensive due to a couple of reasons:
- Cyclone destroyed a large growing region in North Queensland.
- Imports are restricted due to quarantine requirements.
OK, so what is inflation, and what do banana prices have to do with the increase in prices of a whole swag of goods and services? Prices are still ratios, so something has happened to either the supply of goods and services on one hand, or the supply of money on the other.
There is an increase in the money in the system (monetary inflation), which causes the same or fewer goods to increase in price.
Where is this money coming from? It is happening because there is a demand for money which the Reserve Bank of Australia (RBA) is supplying to maintain its monetary policy (the interest rate).
Australia doesn't live in a closed world; we are affected by other countries' demand for our goods as well. Hence when China, Japan, South Korea, Europe and the US want our commodities they will bid the price up, sending the signal that supply is restricted. This leads to a long chain of price signaling all the way through the supply chain.
Why is there such a demand for money at the moment? The RBA has had to increase the interest rate again to try and dampen the demand. See this statement.
This is rational behaviour from investors and everyone else: if you believe that prices are going to be higher in the future, then goods and services are cheaper now than in the future...
The money you have now is going to be worth less in the future, so you exchange for something else.
If the price signal mechanism is constrained due to regulation or monopolies, or the supply is inelastic (e.g. it takes 5-10 years to get a mine going, and 10 years to become a good doctor), then the rational thing to do is to demand as much money now to buy any goods and services which are available now, or borrow to build to satisfy that demand. This is arbitrage at its best.
I have said this in the past: the RBA is acting like an overbearing nanny! It believes it can calm down the arbitrage process by making money more expensive. It is looking into the future and predicting that some of the current demand is transient (caused by other arbitrageurs) and that there will be a massive over-investment in supply.
Why is it rational to take as much money as you can get and buy goods or services?
Why is it rational to get rid of any money you get as quickly as possible?
Steps for being rational:
- I download this spreadsheet from the RBA
- I add some percentage changes over the last year for M1 (Currency and Currency in the bank) raw money.
- If I start at Jan 2006, I see that last year M1 increased by 6.95%!!
- Interest rates are 5.50%!!
- That is a negative real interest rate! You leave the money in the bank and you will get nothing; worse, you will have something which is potentially worth less at the end of the year.
If you check in May 2006 year over year M1 increased by 8.98%!!
Have Fun
Paul
Friday, July 21, 2006
Gapminder visualization of demographic info
Hat-tip to Catallarchy for this awesome visualizer of demographic info.
The ability to move through time takes a snapshot in time and turns it into a good indication of behaviour.
Try a movie (playing the data) of children per woman (fertility rate) against life expectancy.
or
Child Mortality against Fertility rate.
People are rational.
Have Fun
Paul
Monday, July 17, 2006
RSS and the subscribe model
There has been an idea (a meme) building for a while around the use of RSS (or Atom) as a possible way of sharing and propagating information both internally and externally in a business.
Adam Bosworth has a great podcast on this subject when he was talking about Google and his version of a distributed DNS-like database.
As I was reviewing my blog aggregator (Sage) for Firefox I found another link to a page dedicated to RSS in the enterprise.
RSS builds on the acceptance of XML as a basis for sharing information between companies, there are various formats out there for your standard B2B solution. With Oracle getting into the vertical integration game and almost every major software vendor pushing Service Oriented Architecture (SOA) I am sure it won't be long before RSS in some form appears as a solution for retrieving data from diverse data sources.
This is the database replication model, where RSS tells you what has changed so you get everything incrementally, which is faster. You could also receive only what you want, pulling just the data you need, or even only what you are allowed to see, which provides security and privacy.
The possibilities are big... what about taking one or more feeds, adding some value, and passing the result on as a new RSS feed? This is what people call mashups or remixing.
For all the database people, this is like a view on one or more tables: your VIEW of the data which is fed to you.
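To make the analogy concrete, here is a minimal, hypothetical sketch; the table and column names are made up, but the idea is the same: a view, like a remixed feed, selects and reshapes rows from underlying sources without copying them.
-- Two hypothetical tables stand in for two source feeds; the view
-- filters and merges them into the slice a subscriber cares about.
create view my_feed as
select 'orders' as source, order_id as item_id, created_at as published_at
from orders
where created_at > dateadd(day, -1, getdate())
union all
select 'tickets', ticket_id, opened_at
from support_tickets
where opened_at > dateadd(day, -1, getdate());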
From a monitoring point of view, wouldn't it be nice to be able to pull the data you want and forget the rest.
The last interesting idea is based on a metaphor from this article: if you pass around XML, you could pass S-expressions, and if you pass around S-expressions you can pass not only data but code. I had a thought, after reading that article, that maybe the data in a database is really code after all, the tables being just functions (or relations in the original sense), the data model actually reflecting the function of the business (the business model) in abstraction.
So you could in fact use RSS not only to share data or a view of data but also share code or functions.
Have Fun
Friday, July 07, 2006
SQL Server 2005: Creating V$ACTIVE_SESSION_HISTORY
Are you a DBA like me who works on both Oracle and SQL Server (and MySQL)?
Do you miss Oracle performance views like v$session, v$sql and in Oracle 10G v$active_session_history?
I have created a series of tables and procedures which provide roughly the same functionality in SQL server 2005, basically mirroring the same view that is in Oracle 10G. I will backport the stuff to SQL server 2000 at some stage.
These scripts use the new dynamic management views (DMV) available to SQL Server 2005.
Specifically,
- sys.dm_exec_requests
- sys.dm_exec_sessions
The steps to install this on your SQL Server 2005 instance:
- Copy and paste these scripts to a sql file.
- Open up SQL Server Management Studio.
- Select the instance and database where you want to create the tables. I suggest MSDB as the database or a dba specific database if you already have one.
- Open the file as a query.
- Execute the query. This will create the view, table and two stored procedures.
- Create a job which executes the stored procedure usp_ins_sql2005_ASH every n secs. It would be prudent to start with 3-5 secs and see how much overhead that causes.
- Create a job which executes the stored procedure usp_purge_sql2005_ASH. This stored procedure takes a parameter for the number of hours to keep.
Note: Check the growth of your table to determine how much data you want to keep, as busy systems are likely to have many rows per sample.
E.g. Sampling every 2 secs will create 30 samples per minute, 1800 samples per hour. If you have on average 5 sessions running or waiting per sample that will be 9000 rows per hour.
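For the last two steps above (the collection and purge jobs), here is a minimal SQL Agent sketch; the job name, the 10-second interval and the 24-hour retention are only example values, and it assumes the procedures were created in msdb.
use msdb
go
exec sp_add_job @job_name = N'Collect ASH samples';
exec sp_add_jobstep @job_name = N'Collect ASH samples',
    @step_name = N'run usp_ins_sql2005_ASH',
    @subsystem = N'TSQL',
    @database_name = N'msdb',
    @command = N'exec usp_ins_sql2005_ASH';
-- freq_type 4 = daily, freq_subday_type 2 = seconds: run every 10 seconds, all day
exec sp_add_jobschedule @job_name = N'Collect ASH samples',
    @name = N'every 10 seconds',
    @freq_type = 4, @freq_interval = 1,
    @freq_subday_type = 2, @freq_subday_interval = 10;
exec sp_add_jobserver @job_name = N'Collect ASH samples', @server_name = N'(local)';
go
-- The purge job is set up the same way, with a step such as:
-- exec usp_purge_sql2005_ASH @hour = 24;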
I will provide some useful reports from the data you have collected in the next article.
The scripts are attached at the end of this article. Due to HTML parsing < as a tag, you will have to check the purge script before running it.
/* Recreating V$ACTIVE_SESSION_HISTORY in SQL Server 2005
Version: 0.1
Created: Paul Moen 2006
*/
drop view uv_active_session_history
go
create view uv_active_session_history
as
select
getdate() as sample_time,
req.session_id,
req.sql_handle as sql_id,
req.plan_handle as sql_plan_hashvalue,
req.database_id,
req.user_id,
req.command as sql_opcode,
req.status as session_state,
req.blocking_session_id as blocking_session,
req.wait_type as event,
req.wait_time,
sess.program_name as program,
sess.client_interface_name as module
from sys.dm_exec_requests req join sys.dm_exec_sessions sess
on req.session_id = sess.session_id
where req.user_id <> 1
go
drop table active_session_history
go
select top 0 * into active_session_history
from uv_active_session_history
go
drop procedure usp_ins_sql2005_ASH
go
create procedure usp_ins_sql2005_ASH
as
insert into active_session_history
select * from uv_active_session_history
go
drop procedure usp_purge_sql2005_ASH
go
create procedure usp_purge_sql2005_ASH
@hour int
as
delete from active_session_history
where sample_time < dateadd(hh,-1*@hour,getdate())
go
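In the meantime, here is a minimal example of the kind of report the collected data supports; the one-hour window is arbitrary.
-- Top wait events recorded in the last hour of samples
select top 10 event, count(*) as samples
from active_session_history
where sample_time > dateadd(hh, -1, getdate())
and event is not null
group by event
order by count(*) desc;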
Thursday, June 29, 2006
SQLserver 2005 query efficiency sql scripts
Until recently, if you wanted to determine which queries or stored procedures were the least efficient, you would have to capture all SQL, either via SQL Profiler as described in these early posts on automating SQL Profiler and tuning using SQL Profiler, or have a scheduled job which dumped the running SQL from sysprocesses.
Now with the new dynamic performance views available in SQLserver 2005 you can run queries which allow you to determine these poor performers without much extra work.
As I noted in my last entry, the SQLserver 2005 team have a series of blogs and this inspired me to read and port my existing Oracle scripts which determine query efficiency to SQLserver 2005.
One of my favourite Oracle scripts uses the number of logical reads per execution as a good sign of poorly performing sql. Logical reads per execution is also a reasonable estimation of CPU per execution.
Normally on a poorly performing system I tend to follow these steps, in this case rewritten to use the new SQLserver 2005 dynamic views.
- Quickly check taskmanager or perfmon to verify that the CPU or IO hog is in fact SQLserver and not IIS or SQL fulltext indexing services (or something else).
- Check for contention, is there one process blocking all others.
- Run the script to find sql with the highest elapsed time per execution.
- Run the script to find sql with the highest physical reads (PIO) per execution.
- Run the script to find sql with the highest logical reads (LIO) per execution.
-- SQL Efficiency by Elapsed Time. Paul Moen 2006
select qs.creation_time
, qs.execution_count "Exec"
, qs.total_elapsed_time "Elapsed"
, total_physical_reads "PIO"
, total_logical_reads "LIO"
, round(qs.total_elapsed_time/qs.execution_count,1) "Time/Exec"
, round(qs.total_physical_reads/qs.execution_count,1) "PIO/Exec"
, round(qs.total_logical_reads/qs.execution_count,1) "LIO/Exec"
, st.text
from sys.dm_exec_query_stats as qs
cross apply sys.dm_exec_sql_text(qs.sql_handle) as st
where qs.execution_count > 0
order by "Time/Exec" desc
-- SQL Efficiency by Physical Reads per execution. Paul Moen 2006
select qs.creation_time
, qs.execution_count "Exec"
, qs.total_elapsed_time "Elapsed"
, total_physical_reads "PIO"
, total_logical_reads "LIO"
, round(qs.total_elapsed_time/qs.execution_count,1) "Time/Exec"
, round(qs.total_physical_reads/qs.execution_count,1) "PIO/Exec"
, round(qs.total_logical_reads/qs.execution_count,1) "LIO/Exec"
, st.text
from sys.dm_exec_query_stats as qs
cross apply sys.dm_exec_sql_text(qs.sql_handle) as st
where qs.execution_count > 0
order by "PIO/Exec" desc
-- SQL Efficiency by Logical Reads per execution, which is a good estimate for CPU per execution. Paul Moen 2006
select qs.creation_time
, qs.execution_count "Exec"
, qs.total_elapsed_time "Elapsed"
, total_physical_reads "PIO"
, total_logical_reads "LIO"
, round(qs.total_elapsed_time/qs.execution_count,1) "Time/Exec"
, round(qs.total_physical_reads/qs.execution_count,1) "PIO/Exec"
, round(qs.total_logical_reads/qs.execution_count,1) "LIO/Exec"
, st.text
from sys.dm_exec_query_stats as qs
cross apply sys.dm_exec_sql_text(qs.sql_handle) as st
where qs.execution_count > 0
order by "LIO/Exec" desc
Sunday, June 25, 2006
SQLserver 2005 team blogs
I have been scanning some blogs by different teams involved with SQLserver 2005.
Here is a short list:
- SQLserver 2005 Database engine tips
- SQLserver 2005 Query Optimizer
- SQLserver Query optimization - talking about SQL/TSQL optimization
- SQLserver storage Engine
This article from the database engine blog about finding the top N worst SQL brought a (bemused) smile to my face. Finally SQLserver has views like Oracle: no more do you need to run SQL Profiler 24x7, or some script which captures the running SQL every n secs, to have an historical record of what has been running.
Guess this means I can start porting my Oracle scripts from using v$sql, v$sqlarea and in Oracle 10G R2 v$active_session_history to using sys.dm_exec_query_stats, sys.dm_exec_sql_text and sys.dm_exec_query_plan.
The trouble with plenty of relational databases has been the lack of exposure of the metadata/catalog of the database and the data within that catalog. Until recently plenty of the internal stuff in SQLserver had to be queried using various DBCC calls. Similarly, MySQL versions prior to 5 have the same issue with SHOW TABLE STATUS and SHOW FULL PROCESSLIST etc.
There is no nice way to see what the spread of data within a column is. It is good that these vendors are exposing this to the outside world via SQL rather than a tool which requires the output to be further parsed.
Have Fun
Friday, June 16, 2006
Interesting web articles
I am getting back into the groove, catching up on all those interesting blog entries and web articles which I missed in Canada, not because I didn't have internet access, but because my whole RSS feed reader wasn't there, and neither were my bookmarks/favourites. My setup is the result of many hours of filtering which I don't need to spend time remembering (unless I need to).
I am working from home in the interim, which has been great, depending on what I am doing I enjoy working or studying to some music. As I have mentioned in the past pandora provides a great resource of music. I tend to listen to a genre called vocal trance which is basically dance music, other times it might be classical, jazz or even metal in various forms. I like my music to be packed with plenty of action and depth.
Here are a couple of articles worth reading. I find the area of constraint programming interesting; there is a Prolog Sudoku solver as well which uses a similar method, using Prolog constraints to solve the puzzles. The idea that the program moves forward and reduces the problem set might be useful in tuning situations where the need to eliminate non-issues (sometimes in a hurry) is important.
Lots of developers have issues with using constraints in databases, mainly relating to error handling. That is kinda weird, as the database model is really an expression of a business or application model, e.g. a sale must have a product and a customer.
Foreign key, unique and check constraints allow the database to police those business rules (or constraints) so that the data remains accurate, without depending on any application.
In some sense the data model provides insight into the business. What problem (for the customer) is the business trying to solve...
Have Fun
Sunday, June 11, 2006
Back to Sydney
My three weeks in Canada is almost over.
I am killing a couple of hours in Vancouver before heading back to Sydney.
If you get a chance go and have a read of the Pythian blog. Like I said before, these blokes and gals are smart. Christo is a damn fine Starcraft player as well. I thought I was reasonable, but hey, 6 minutes and it's over is really really good.
I am going to be spending time getting the Australian office happening and then Pythian goes 24x5!!
My time is limited again.
Craig Sham. Orapref has done some interesting work on performance forecasting. This I believe is the next tuning wave, after ratios, waits and response time.
Hotsos is doing some fine stuff with queuing theory as well.
Enough for now. I will be back on Aussie time in about 20 hours
Have Fun
Sunday, June 04, 2006
Kaplan-Meier analysis
The blokes at Pythian are damn smart.
Paul Vallee mentioned the use of a statistical analysis of survival. Specifically Kaplan-Meier survival analysis.
After Paul explained how he was using it, I thought of other areas where you could use it. It comes down to how or what you class as being survival and death.
- Database performance. At what point does the performance hang (or die) and what is the mean time of survival. Survival could be a reasonable response time for a known query.
- Stock survival. The ASX, whilst having large blue chip stocks, contains a lot of speculative mining and biotech startups. It would be interesting to run some survival analysis over them.
Anyway I have to go. Going to play computer games all afternoon. Should be good.
Have Fun
Sunday, May 28, 2006
First week in Canada or cleaning internet cafe computers
Finally I have a chance to use a computer without the threat of spyware, adware and rootkits. I thought I could use an internet cafe a couple of blocks from where I am staying, but the last visit required 40 minutes of downloading Spybot, Ad-Aware, firewalls and the Sysinternals rootkit revealer just to make the machine OK to use. It was badly infected with spyware and adware.
Anyway I chose to come into the Pythian office and use the computer I have been given for the duration of my stay as I know it is secure.
So it is Saturday morning Canadian (EDT) time just before lunchtime.
The first week has been busy, the learning curve steep, but I have jumped into handling the daily processes and alerts as a way to get up to speed as quickly as possible. Doing actual work as opposed to reading documentation and having meetings (whilst important) is a much faster way to "get a grip" on the new work environment, processes, clients etc.
The system that Pythian has for handling the large number of clients and the larger number of databases (client -> databases is a one-to-many relationship :) ) is very, very good.
Like any successful company, systems and processes are one of the keys to allowing the company to expand and also provide a method of storing or maintaining what is called organizational learning or knowledge. This way if someone leaves the company, the knowledge doesn't leave as well.
I won't post much in the way of the new direction and database related posts until I get back to Australia.
More later, have fun.
Saturday, May 20, 2006
Blogging from Canada
As part of my new role I am going to meet the Pythian team based in Ottawa, Canada. It will be great to meet the guys and girls face to face.
So at the moment I am sitting in the departure area of Vancouver Airport waiting for the connection to Ottawa. I have time to burn so I thought it might be fun to write an article from an airport internet terminal. Humour me...
I have a couple of ideas for new articles and maybe an extra direction for the site relating to ASX listed stocks.
Some ideas I want to pursue in the near future are:
- Using analytical functions in Oracle in combination with Oracle statistics gathering to provide an in-depth and potentially new way to view performance metrics.
- Use the Oracle Data Miner and prediction functions to actually produce predictive performance analysis based on the Oracle stats.
- Review the state of SQLserver dates as stored by the datetime datatype. We have seen some weird dates which do not match what the online books (doco) say are the limits.
- Further review and detailed outlines of installing Oracle on Ubuntu.
- Expansion on the dead money comparisons (rent vs mortgage interest) to truly show the tax effective nature of buying and leasing for the first 7-10 years of the loan.
- Build on my expanding knowledge of Analytics to produce some unique reports on ASX listed stocks.
I am also going to write some blog articles for Pythian on their blog once I get going. The first one relates to extracting a complete copy of the CTXSYS schema to facilitate moving a schema with text or domain indexes.
Plenty of work ahead and as I mentioned, the more popular a subject the more I will use that to guide the articles I write in the future.
My time is limited to the coins I have so I will sign off.
Have Fun
Paul
Monday, May 15, 2006
Google Ads and feed links
All those returning visitors would have noticed that there are now Google ads and some RSS feed links.
This is an exercise in reviewing and working with the various web technologies rather than a chance to make any money. Given the site cracked 300 visitors this month (as measured by statcounter.com) you see what I mean.
I have been playing around with including google ads for a while. I might also upgrade the amazon stuff and actually start writing some reviews for the books I have read in my lifetime, although most of those books are not related to the theme of this blog.
The most popular pages of this blog are:
- Oracle on Ubuntu
- Downgrading Oracle Enterprise Edition to Standard Edition
- Comparison of Dead Money - Rent vs Mortgage Interest
Given the signals I am receiving from the market (of web surfers and search engine results), I should start writing some more about those areas. Signal to Noise ratio is a bit low at the moment...
Have Fun
Paul
Sunday, May 14, 2006
Market Profile perl script
I posted ages ago on an ASX stockmarket forum that I had developed a perl script to display a market profile.
What is a market profile?
- Here are links to the CBOT education site with another link to Market Profile 101
- In a nutshell, a market profile seeks to show the distribution of prices intraday from tick or trade-by-trade data.
- It is simple to draw, which allowed floor traders to draw it by hand (before portable devices).
- It can reveal a different trading pattern to what the standard OHLC price bar suggests.
Anyway I still get requests for the perl script via email even though that post was back in 2001.
As part of the move away from perl to SQL for other stuff, I am going to load my tick data into my OracleXE database and I should be able to rewrite the perl script as SQL then rewrite as a function.
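As a rough idea of what that SQL rewrite could look like, here is a sketch assuming a hypothetical ticks table (trade_time DATE, price NUMBER) and a fixed 0.05 time-price range; the real script varies the range with the price level.
-- Bucket trades into half-hour letter periods (A = 10:00-10:30) and
-- 0.05 price levels, then count the trades in each cell of the profile.
select round(price / 0.05) * 0.05 as price_level,
       chr(ascii('A') + floor(((trade_time - trunc(trade_time)) - 10/24) * 48)) as period,
       count(*) as trades
from ticks
where (trade_time - trunc(trade_time)) between 10/24 and 16.5/24
group by round(price / 0.05) * 0.05,
         chr(ascii('A') + floor(((trade_time - trunc(trade_time)) - 10/24) * 48))
order by price_level, period;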
If you want a copy of the perl script, email me. roobaron-/*?at*?/-yahoo.com. Take out the regex breaking bit.
or
Cut and paste the script from here
Have Fun
Paul
### CUT HERE ###
# Market Profile
# Paul Moen 05/12/2001
# Explanation:
# Takes csv file from Weblink and converts into market profile
# Traditionally market profile was split on 30 minute increments
# Each 1/2 hour is given a letter starting at A
# Range is determined by Price based on ASX tick movements
# i.e <> 1.00 increments 0.01
# So. each TP (time point) will be 5 ticks. eg price > $1.00 TP = 0.05
# Program logic based on manual procedure. i.e as data arrives time is checked
# and price is checked to see if resides with existing TP or needs a new TP
# Read the attached readme_market_profile.txt for more info on the program logic.
# Read csv file piped via sort (sorting on price)
# $title =
# Format of file is expected to be like this
# No, Date, Time, price, volume, sale, C Code
# Therefore @line[3] (4th element) is the price.
# Set variables
$tpr_flag = 0;
$last_price = 0;
$period_change_flag = 0;
$tp_list = '';
@mp_list = '';
while ($line = <>) {
chop($line);
@line = split(/,/,$line); # split line into array elements.
# grab the date from 2nd element
$date = @line[1];
# Grab the time and determine what letter it is
# Note: Any time before 10am is ignored and time after 4.30pm is ignored
# Letter increment by 1/2 hour
# Now grab the price
$price = @line[3];
# We are going to calculate the tpr only once so check to see if it set already
if ($tpr_flag != 1){
# tpr is the time price range
if ($price > 1.00){
# tick is 0.01, tpr is 0.05
$tpr = 0.05;
$tpr_flag = 1;
} elsif ($price <= 1.00 && $price > 0.10){
# tick is 0.005, tpr is 0.025
$tpr = 0.025;
$tpr_flag = 1;
} elseif ($price < tpr =" 0.005;" tpr_flag =" 1;" time =" @line[2];" period =" 'A';" period =" 'B';" period =" 'C';" period =" 'D';" period =" 'E';" period =" 'F';" period =" 'G';" period =" 'H';" period =" 'I';" period =" 'J';" period =" 'K';" period =" 'L';" period =" 'M';" period =" 'N/A';" period_change_flag =" 1;" period_change_flag =" 0;" last_price ="="" tp_list =" $price;" mp_length =" push(@mp_list,$tp_list);" last_price =" $price;" price_ge =" $last_price" price_le =" $last_price" the =","> or <> $price_le) {
# Price is within tpr and no change to , so no change to list
} elsif ( $price > $price_ge ) {
# price is greater than the price + tpr
$tp_list = $price_ge; # Price is in new tpr
$tp_list .= ' ';
$tp_list .= $period;
# push tp_list onto the top of mp_list (main list)
# @mp_list[length -1] becomes the last point.
$mp_length = push(@mp_list,$tp_list);
$last_price = $price;
# Calculate the timepoint range tpr
$price_ge = $price_ge + $tpr;
$price_le = $price_le - $tpr;
} elsif ($price < $price_le) { # price is less than price - tpr $tp_list = $price_le; # Price is in new tpr $tp_list .= ' '; $tp_list .= $period; # shift tp_list onto the bottom of the mp_list (main list) # @mp_list[0] becomes the start. $mp_length = shift(@mp_list,$tp_list); $last_price = $price; # Calculate the timepoint range tpr $price_ge = $price_ge + $tpr; $price_le = $price_le - $tpr; } else { print "something is wrong\n"; } } else { # Period is not equal to A need to work with list of scalars if (($price < $price_ge && $price > $price_le) {
# Price is in current tpr but the period is different, there should be
# a tp already in existence which requires an additional tp to be added.
} # period A check
} # everything else
$current_period = $period;
} # end of while
# end of market_profile.pl
### CUT HERE ###
Wednesday, May 03, 2006
Commentary of RBA Interest rate rise
The Reserve Bank of Australia (RBA) decided to raise its money market rate by 25 basis points (0.25%) to 5.75%. It also released a media statement in conjunction with the announcement.
A quick summary (although you should go and read it):
- International growth is driving strong growth in business profits, which is enabling business to invest and borrow on the strength of those increased cashflows.
- Domestic consumers are able to borrow more as wage growth increases borrowing capacity.
- Wage growth is driven by business expansion.
- The RBA thinks the current borrowing is too much and/or too quick and wants to dampen the demand for credit by increasing the rate.
If the business growth is coming from a stronger export sector, that sector will be getting its demand signals and its increasing cash flows from overseas consumers.
If those consumers are intermediate producers or producers of consumer goods for export, those producers will be taking their demand signals, stronger cashflows and the ability to borrow more from their overseas customers.
The RBA can't win, it can't set the Chinese interest rate or US interest rate or Japanese rate or EU rate, or stop individuals or companies fulfilling demand either domestically or internationally.
Instead it acts like an overbearing nanny, scolding businesses for borrowing to invest (in capital) so they can expand production, so they can produce and spend more.
The headlines have all been about the poor old Australian consumer who has the massive mortgage and how the rate hike will make things harder, and it will for highly leveraged consumers.
The consumer is collateral damage, this rate rise is aimed directly at reducing business borrowings for investment. When businesses cough, consumers catch a cold.
What the RBA has done is make it harder for Australian businesses to expand their production in the face of increased demands. This will limit the ability of those businesses to supply both domestic and international customers, therefore limiting their ability to spend but also limiting their ability to increase profits and most likely productivity.
The RBA is worried about price inflation, however by restricting the ability to increase supply when demand is increasing, its decision will cause prices to rise. That is what PRICE increases are signaling to businesses: INCREASE supply, INVEST in production.
The RBA like any other public institution needs to be seen to act otherwise its reason for existing risks being questioned. In increasing or decreasing the rate, the RBA changes the whole production structure within Australia, the structure it most likely can't fully comprehend.
It would be better leaving the rate stable, never again changing it. This would allow businesses to quit worrying about the effect of changes in interest rates on their profits and ability to expand, and allow those businesses to make decisions based on price signals. Businesses try to achieve some stability by hedging interest rates, however most hedges are relatively short term.
If the RBA said that the rate was not going to change in 30 years, that stability would allow businesses to make much longer range decisions, enabling longer chains of production to emerge. Instead it plays around with the rate, decreasing it to increase the attractiveness of borrowing, then having remorse two or three years later and deciding to increase it again.
The joys of having a single Command and Control institution controlling the most important price signal of all i.e. the price of money.
Have Fun
Monday, May 01, 2006
Downgrading from Enterprise Edition to Standard Edition
The business decided that there was no requirement to run Enterprise Edition for one of the databases that I support. So today and early tomorrow morning we (the junior DBA and I) are going through the process of downgrading to Standard Edition.
Here are the steps Oracle recommends:
- Export the whole database to file using exp or expdb, depending on the version.
- Deinstall the Enterprise Edition using Oracle installer.
- Reinstall the Standard Edition software using Oracle software.
- Create a new database and import the data from the export file.
We are going to do it slightly differently:
- Install Standard Edition in a different Oracle home and different Oracle home path eg. $ORACLE_HOME becomes /oracle/product/920
- Create a new database using dbca or from scripts. Applying all necessary patches.
- Export the whole database from old database.
- Shutdown old database.
- Import into new database.
- Switch listener to point at new database.
A couple of gotchas when using dbca to create the database:
- LOG_ARCHIVE_DEST_n doesn't work for Standard Edition. This parameter was set by the dbca in its standard init.ora file. This parameter setting returns the error ORA-439 feature not enabled: Managed Standby. Use LOG_ARCHIVE_DEST instead for Standard Edition.
- When using ssh -X hostname you can't use su or sudo to change to oracle. Tunneling X through ssh (the -X option) requires ssh -X oracle@hostname to get the right DISPLAY set.
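For the first gotcha, assuming the new database was created with an spfile, one way to swap the parameters (the archive destination path is only an example):
-- LOG_ARCHIVE_DEST_n raises ORA-439 on Standard Edition; clear it and
-- set the plain LOG_ARCHIVE_DEST parameter instead, then restart.
alter system reset log_archive_dest_1 scope=spfile sid='*';
alter system set log_archive_dest = '/oracle/admin/newdb/arch' scope=spfile;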
Thursday, April 27, 2006
Trade deficit and Foreign Debt - 6
It has been a while since I wrote something about Australia's Trade Deficit and Foreign Debt. In fact the last article was way back in June 2005!!
The Aus. Bureau of Stats. (ABS) released the figures for International Trade in Goods and Services February 2006 recently, and the original figures are so close to balanced that there was only a minor $AUD 105 million in it.
This increase in Goods credits is still driven by the demand for ore and minerals. Australia's commodity (BBQ) economy continues to provide the fuel for growing economies. Almost all the goods were up by at least 12-15% over the last month's numbers, suggesting that the numbers were adversely affected by the shorter February month.
If you review the pdf you can see the tables for the original series: the deficit gets close to balance and then extends again. Civilian Aircraft makes an important impact on the debit side of the ledger, however fuel debits are up 43% over 8 months compared to last February.
Not signing the K. Protocol means that Australia has not been penalised for having a comparative advantage in coal, given coal and coking coal are now by far our largest exports.
The other stuff hasn't changed much as the US is still our biggest trading partner and the partner which we run the highest trade deficit with.
For all the worry and concern about China destroying our manufacturing, it is benign with the overall trend in our favour, our exports to China rose $AUD 3.6 Billion, 48% over 8 months (Feb 05 to Feb 06) whereas our imports only rose $AUD 1.9 Billion, 15% over the same period.
A quick check of the finances saw non-financial corporations borrowing heavily, most likely to pay for expansion to meet the demand. An early signal that corporations feel they have expanded enough would be a continued fall in demand for imported capital goods, although the original series still shows continued demand for what would seem to be mining related equipment. This might also be seen in a reduction in corporate credit, although this could last longer as corporations use strong cash flows to borrow for mergers and acquisitions (M&A) to continue to expand.
Takeover fever really hasn't touched the booming mining industry, as high prices allow companies to revisit unexplored leases both in Australia and elsewhere, rather than choosing to buy working mines to boost production whilst keeping costs at bay. M&A is happening in areas which are feeling the pinch on margins.
Mining costs will increase as lower yield ores are sourced to fill existing orders, i.e. the mine life can be extended as most calculations are based on a specific price; increases in price make mining lower grade ores possible, and if the company is smart it will mine those lower grade ores now at a higher cost whilst saving the lower cost, higher yielding ores for when prices drop. The aim after all is extending the life and profit of the mine for as long as possible. This method is longer term, ending up with more profit at the end, rather than having massive profits now at the expense of few or none later. I am sure there are companies doing both...
Have Fun
Previous articles:
Trade Deficit and Foreign Debt -1
Trade Deficit and Foreign Debt -2
Trade Deficit and Foreign Debt -3
Trade Deficit and Foreign Debt -4
Trade Deficit and Foreign Debt - 5
Related Articles:
Australian Trade Partners
Wednesday, April 26, 2006
Battlefield 2 fun #3
I made the 5000 point Staff Sergeant rank last week, and I took the upgrade to the assault weapon. I have been playing around with a tool called FRAPS which displays the framerate in the game and can also take movies. Every time I do something which would be interesting to see again it is too late unfortunately.
I have been disappointed in every upgraded weapon apart from the medic upgrade; the scope and semi-auto accuracy are very good. The rest of the weapons (apart from the sniper) seem less accurate; I still seem to shoot everything but the enemy in close combat, even with the sights trained squarely on them.
I would love EA to have kept data so that I could review my performance over time. I am definitely a better player now, but as the stats are still aggregate, my earlier low scores still have too much weight in the points per minute and accuracy counters etc.
It is still great fun, even better to kill people who are clearly using some kind of dodgy aim device. I see it in every round, with 5-10% of people immediately doing the dive/roll and shot on seeing you. More often than not that shot is a headshot. Are they just that good? From that distance? With that weapon? When I am moving at the same time?
As my muscle memory improves now I instinctively jump or dodge for cover (mostly in vain), sometimes I survive the first shot and kill them (much to their amazement I guess). I had one guy do the move 3 times and I still got him as I kept a building between me and his headshots.
If I feel I am getting better and the game is now almost 12 months old, does that mean the other players are also going to be better, rather than rookies?
Or have the best left for the next game and only the latecomers and therefore rookies are still left?
Have Fun