Possible Y2K trap in using DECODE?

Post by Jurij Mod » Sat, 09 Jan 1999 04:00:00



Today I came across a very interesting situation that I think could
be a potential Y2K problem.

Here is the scenario:
In one of our projects, we receive source data from a mainframe, load
it with SQL*Loader into temporary tables, and from there transform it
according to the required business rules into the destination tables
of the data warehouse. Pretty standard, I guess. There are a couple of
date fields in the source files, recorded as 8-character strings in
the format 'DDMMYYYY'. Given this, one could conclude there is no real
Y2K danger, as the year part of the dates is passed with the century,
not just the last two digits. But there is a catch. One of the
business rules says:

"Dates are passed as character strings in the format 'DDMMYYYY' and
should be stored as the DATE datatype in the Oracle database. However,
if the date was unknown, it was stored as '00000000' or '01010001' in
the legacy system, and in these cases the dates should be stored as
NULL values in the Oracle database".

To fulfill this rule, we used the DECODE function in the INSERT
statements of our stored procedures, something like:

INSERT INTO destination_table (date_column)
  SELECT DECODE(char_date, '00000000', NULL, '01010001', NULL,
                TO_DATE(char_date,'DDMMYYYY'))
  FROM temp_table;

At first glance everything seems fine: no Y2K bug should be possible,
since there is an explicit character-to-date conversion with the
TO_DATE function using a format mask with four-digit years. But guess
what: there is a very high probability that dates from the 21st
century will be stored as dates in the 20th century!

Here is a quick demonstration:

SQL> SELECT sysdate FROM dual;

SYSDATE
--------
08.01.99

SQL> CREATE TABLE destination_table (date_column DATE);

Table created.

SQL> INSERT INTO destination_table (date_column)
  2    SELECT DECODE('31.12.2000', '00000000', NULL,
  3                   TO_DATE('31.12.2000','DD.MM.YYYY'))
  4    FROM DUAL;

1 row created.

SQL> SELECT TO_CHAR(date_column,'DD.MM.YYYY') FROM destination_table;

TO_CHAR(DATE_COLUMN,'DD.MM.YYYY')
----------------------------------------------------------------------
31.12.1900

The date_column should contain a date in the year 2000, not 1900!
There is absolutely nothing wrong with the TO_DATE function or the
format mask. Actually, even if we directly use a DATE value as a
condition-satisfying result of the DECODE function, dates past the
year 2000 will be stored as dates in the 20th century. Example:

SQL> TRUNCATE TABLE destination_table;

Table truncated.

SQL> SELECT TO_CHAR(sysdate+1000,'DD.MM.YYYY') FROM dual;

TO_CHAR(SYSDATE+1000,'DD.MM.YYYY')
----------------------------------------------------------------------
04.10.2001

SQL> INSERT INTO destination_table (date_column)
  2    SELECT DECODE('01.11.1999', '00000000', NULL, sysdate+1000)
  3    FROM DUAL;

1 row created.

SQL> SELECT TO_CHAR(date_column,'DD.MM.YYYY') FROM destination_table;

TO_CHAR(DATE_COLUMN,'DD.MM.YYYY')
----------------------------------------------------------------------
04.10.1901

Instead of the year 2001, the date was stored in the year 1901!

This all happens because we used the DECODE function. And it can't
even be considered an Oracle bug, because this particular behaviour of
DECODE is described in the docs, but I believe most people haven't
recognized it as a potential Y2K bomb. So what is it about this
unusual behaviour of DECODE? Here is a quote from the SQL manual:

"Oracle7 automatically converts the return value to the same datatype
as the first result. If the first result has the datatype CHAR or if
the first result is null, then Oracle7 converts the return value to
the datatype VARCHAR2."

In our case, the first result in the DECODE was a NULL, so DECODE
implicitly converted the regular DATE value to a VARCHAR2 string using
the default format mask (with only two digits for the year part),
thereby losing the information about the century. When inserting the
record into the table, it implicitly converted the string back into a
date, but the century information was already lost in the previous
conversion, so the current century was used. There is no programmatic
cure for this behaviour: no additional TO_DATE or TO_CHAR functions
inside or outside the DECODE can help here, and no 'RR' or 'RRRR'
format masks can help. The only way to handle this problem is to use a
four-digit year format as the default date format of the session
(either by setting NLS_DATE_FORMAT in the client-side environment or
with ALTER SESSION SET NLS_DATE_FORMAT at the beginning of the
session).
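
One quick way to actually see this implicit conversion is to look at
the datatype code of the DECODE result with the DUMP function (if I
read the internal type codes correctly, Typ=1 is VARCHAR2 and Typ=12
is DATE). This is just a rough check, not part of our load code:

SELECT DUMP(DECODE('01.11.1999', '00000000', NULL,    SYSDATE)) null_first,
       DUMP(DECODE('01.11.1999', '00000000', SYSDATE, SYSDATE)) date_first
FROM dual;

With NULL as the first result the dump should report Typ=1 (VARCHAR2),
while with SYSDATE as the first result it should report Typ=12 (DATE) -
which is exactly the silent conversion described above.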

Until now I was under the impression that if I handle all
character-to-date conversions with an explicit TO_DATE function and a
'YYYY' format mask, there is no possibility of Y2K trouble. Now I'm
convinced that this approach just isn't 100% safe. So I guess it's
time to check how we handled dates when using the DECODE function in
all of our SQL and PL/SQL programs...

Any comments?


Certified Oracle7 DBA (OCP)
================================================
The above opinions are mine and do not represent
any official standpoints of my employer

 
 
 


Possible Y2K trap in using DECODE?

Post by Erwin Dondorp » Mon, 11 Jan 1999 04:00:00


Jurij,

Thanks for telling us about this subject (and explaining it!)
I think you can solve your problem by recoding your statement to:

INSERT INTO destination_table (date_column)
  SELECT DECODE(char_date, '00000000', TO_DATE(NULL), '01010001', NULL,
                TO_DATE(char_date,'DDMMYYYY'))
  FROM temp_table;

The only thing that has changed is the TO_DATE around the first
result in the DECODE. This works because even a NULL value has a
datatype.
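
If you want to double-check which datatype a DECODE expression really
returns, one possible way (just a quick sketch; the view name is made
up) is to wrap the expression in a view and DESCRIBE it in SQL*Plus:

CREATE OR REPLACE VIEW decode_type_check AS
  SELECT DECODE('x', 'a', TO_DATE(NULL), SYSDATE) d FROM dual;

DESCRIBE decode_type_check

If the typed NULL does its job, the column d should be reported as
DATE rather than VARCHAR2.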

    Erwin Dondorp
    http://www.wxs.nl/~erwindon



 
 
 

Possible Y2K trap in using DECODE?

Post by Jurij Mod » Mon, 11 Jan 1999 04:00:00




Quote:>Jurij,

>Thanks for telling us about this subject (and explaining it!)
>I think you can solve your problem by recoding your statement to:

>INSERT INTO destination_table (date_column)
>  SELECT DECODE(char_date, '00000000', TO_DATE(NULL), '01010001', NULL,
>                TO_DATE(char_date,'DDMMYYYY'))
>  FROM temp_table;

>The only thing that has changed is the TO_DATE around the first
>result in the DECODE. This works because even a NULL value has a
>datatype.

Erwin,

Thanks for the suggestion. Unfortunately this doesn't help (by the
way, this was the first thing I tried when I discovered this
behaviour): although after this modification the first NULL result is
of type DATE, the resulting value is still NULL, and DECODE will
convert the result to the VARCHAR2 type regardless of the type of that
first NULL result!

The only workaround for this problem I have found in the meantime is
to force the first result of the DECODE to be a non-null DATE value
and to make sure the corresponding condition never evaluates to TRUE.
In my case, for example, I know that in the source file all date
values contain only numeric characters, so I can use the following
DECODE to be sure the resulting datatype will be DATE:

INSERT INTO destination_table (date_column)
  SELECT DECODE(char_date, 'xyzabc_-', SYSDATE,
                           '00000000', NULL,
                           '01010001', NULL,
                TO_DATE(char_date,'DDMMYYYY'))
  FROM temp_table;

Because I know char_date will never have the value 'xyzabc_-', I can
be sure the DECODE will never actually return SYSDATE; and because
SYSDATE is a non-null DATE, the DECODE's return type is DATE, so there
is no implicit date-to-character conversion inside the DECODE.

Quote:>    Erwin Dondorp
>    http://www.wxs.nl/~erwindon

Regards,


Certified Oracle7 DBA (OCP)
================================================
The above opinions are mine and do not represent
any official standpoints of my employer

 
 
 

Possible Y2K trap in using DECODE?

Post by Connor McDonald » Tue, 12 Jan 1999 04:00:00



snip

> In our case, the first result in the DECODE was a NULL, so DECODE
> implicitly converted the regular DATE value to a VARCHAR2 string
> using the default format mask (with only two digits for the year
> part), thereby losing the information about the century.

Even worse - for the same reasoning, you can sometimes lose the time
portion of a date...
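
Just to sketch what I mean (assuming a default NLS_DATE_FORMAT in the
usual DD-MON-YY style, i.e. one without a time component), compare:

SELECT TO_CHAR(SYSDATE, 'HH24:MI:SS') real_time,
       TO_CHAR(TO_DATE(DECODE('x', 'a', NULL, SYSDATE)),
               'HH24:MI:SS') after_decode
FROM dual;

Since the DECODE returns VARCHAR2, SYSDATE is first formatted with the
default mask (which has no time part) and then converted back to a
DATE, so after_decode should come out as 00:00:00 while real_time
shows the actual time.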

Cheers
--
==============================================
Connor McDonald
BHP Information Technology
Perth, Western Australia
"The difference between me and a madman is that I am not mad"
                                - Salvador Dali

 
 
 

Possible Y2K trap in using DECODE?

Post by Gerard M. Averill » Tue, 12 Jan 1999 04:00:00




snip

Quote:>The only workaround for this problem I have found in the meantime is
>to force the first result of the DECODE to be a non-null DATE value
>and to make sure the corresponding condition never evaluates to TRUE.
>In my case, for example, I know that in the source file all date
>values contain only numeric characters, so I can use the following
>DECODE to be sure the resulting datatype will be DATE:

>INSERT INTO destination_table (date_column)
>  SELECT DECODE(char_date, 'xyzabc_-', SYSDATE,
>                           '00000000', NULL,
>                           '01010001', NULL,
>                TO_DATE(char_date,'DDMMYYYY'))
>  FROM temp_table;

As you said earlier, your problem comes from the implicit conversions
from DATE to VARCHAR2 (and vice versa) using the default date format.
You could change the default date format, but since it can be changed
by each user and thus override your system setting, this wouldn't
guarantee a solution (though it could be acceptable).  The other
solution, which doesn't rely on a "magic" value such as 'xyzabc_-', is
to make sure there are no implicit conversions at all:

INSERT INTO destination_table (date_column)
  SELECT to_date(
           DECODE(char_date, '00000000', NULL, '01010001', NULL, char_date)
         , 'DDMMYYYY')
  FROM temp_table;

I've had the default date format bite me, too, and have learned to avoid
relying on it.  Hope this helps.
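
Applied to the literal from your first demonstration, that would look
something like this (just a sketch, reusing the destination_table from
your test):

INSERT INTO destination_table (date_column)
  SELECT TO_DATE(
           DECODE('31.12.2000', '00000000', NULL, '01010001', NULL,
                  '31.12.2000')
         , 'DD.MM.YYYY')
  FROM dual;

Here the DECODE works purely on VARCHAR2 values, and the single
explicit TO_DATE with a four-digit mask does the only conversion, so
the century can no longer be lost along the way.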

Gerard

----
Gerard M. Averill, Researcher
CHSRA, University of Wisconsin - Madison
GAverill<at>chsra<dot>wisc<dot>edu

 
 
 

Possible Y2K trap in using DECODE?

Post by Jurij Mod » Tue, 12 Jan 1999 04:00:00






>snip

>INSERT INTO destination_table (date_column)
>  SELECT to_date(
>           DECODE(char_date, '00000000', NULL, '01010001', NULL, char_date)
>         , 'DDMMYYYY')
>  FROM temp_table;

>I've had the default date format bite me, too, and have learned to avoid
>relying on it.  Hope this helps.

Gerard,

Your solution is by far the easiest and the best one for this case.

Thanks!

Quote:>Gerard

>----
>Gerard M. Averill, Researcher
>CHSRA, University of Wisconsin - Madison
>GAverill<at>chsra<dot>wisc<dot>edu


Certified Oracle7 DBA (OCP)
================================================
The above opinions are mine and do not represent
any official standpoints of my employer
 
 
 

Possible Y2K trap in using DECODE?

Post by Shawn Ramse » Wed, 13 Jan 1999 04:00:00


Snip

Quote:>As you said earlier, your problem comes from the implicit conversions
>from DATE to VARCHAR2 (and vice versa) using the default date format.
>You could change the default date format, but since it can be changed
>by each user and thus override your system setting, this wouldn't
>guarantee a solution (though it could be acceptable).  The other
>solution, which doesn't rely on a "magic" value such as 'xyzabc_-', is
>to make sure there are no implicit conversions at all:

Why not just explicitly set the date format in the SQL*Plus session?
Just do an ALTER SESSION SET NLS_DATE_FORMAT = 'DDMMYYYY' and there
will be no question about the conversion.
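
For example (a rough sketch; I'm using the DD.MM.YYYY mask from the
earlier demonstrations so that the display matches those examples):

ALTER SESSION SET NLS_DATE_FORMAT = 'DD.MM.YYYY';

INSERT INTO destination_table (date_column)
  SELECT DECODE(char_date, '00000000', NULL, '01010001', NULL,
                TO_DATE(char_date,'DDMMYYYY'))
  FROM temp_table;

With a four-digit default mask, the implicit DATE-to-VARCHAR2
conversion inside the DECODE keeps the century, so the value survives
the round trip back to a DATE on the insert.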
 
 
 

Possible Y2K trap in using DECODE?

Post by Connor McDonald » Thu, 14 Jan 1999 04:00:00



> Snip

> Why not just explicitly set the date format in the SQL*Plus session?
> Just do an ALTER SESSION SET NLS_DATE_FORMAT = 'DDMMYYYY' and there
> will be no question about the conversion.

Easy in your SQL Plus session - not so easy if you have the same code in
500 or so application programs...  :)

Cheers
--
==============================================
Connor McDonald
BHP Information Technology
Perth, Western Australia
"The difference between me and a madman is that I am not mad"
                                - Salvador Dali

 
 
 

Possible Y2K trap in using DECODE?

Post by Grinberg L. » Fri, 15 Jan 1999 04:00:00



>> SQL> INSERT INTO destination_table (date_column)
>>   2    SELECT DECODE('31.12.2000', '00000000', NULL,
>>   3                   TO_DATE('31.12.2000','DD.MM.YYYY'))
>>   4    FROM DUAL;

>> 1 row created.

>> SQL> SELECT TO_CHAR(date_column,'DD.MM.YYYY') FROM destination_table;

>> TO_CHAR(DATE_COLUMN,'DD.MM.YYYY')
>> ----------------------------------------------------------------------
>> 31.12.1900

I agree that it is an unexpected but documented result.

But you can try

 INSERT INTO destination_table (date_column)
    SELECT DECODE('31.12.2000', '00000000', TO_DATE(NULL),  -- !!!
                  TO_DATE('31.12.2000','DD.MM.YYYY'))
    FROM DUAL;

and you will get true dates.

Good luck,


 
 
 

Possible Y2K trap in using DECODE?

Post by Jurij Mod » Fri, 15 Jan 1999 04:00:00


On Thu, 14 Jan 1999 11:36:33 +0200, "Grinberg L." wrote:



>I agree that it is an unexpected but documented result.

>But you can try

> INSERT INTO destination_table (date_column)
>    SELECT DECODE('31.12.2000', '00000000', TO_DATE(NULL),  -- !!!
>                  TO_DATE('31.12.2000','DD.MM.YYYY'))
>    FROM DUAL;

>and you will get true dates.

>Good luck,



Leon,

I could swear this was the first thing I tried when I discovered this
DECODE behaviour, and I could swear the results convinced me that it
was not a solution - I also wrote that in my reply to Erwin. But I
have just tried it again, and now I see that this really is a cure - I
obviously made a mistake in my first test.

So I stand corrected - TO_DATE(NULL) really does solve this DECODE
problem.

Thanks,

Certified Oracle7 DBA (OCP)
================================================
The above opinions are mine and do not represent
any official standpoints of my employer

 
 
 


