Friday, December 2, 2016

Database Virtual Box Appliance / Virtual Machine - 12c for Practice



Download it from here: click me

The pre-installed Oracle Database 12c in an Oracle Linux environment can be downloaded from the above link.

It will be very helpful for practicing 👍👍👍

Thursday, December 1, 2016

Oracle Tuning Approach

I came across a very good picture detailing the steps to tune a SQL statement, and here it is for you.


Useful SQL Info - Part-2


  1. If we want to generate random numbers or random strings from Oracle (say, for testing purposes), we can use the DBMS_RANDOM package.

select dbms_random.value,                             -- random number between 0 and 1
       dbms_random.value (1, 100),                    -- random number between 1 and 100
       trunc (dbms_random.value (1, 100)),            -- random integer
       -- string options: 'A' mixed-case, 'U' upper-case, 'L' lower-case, 'X' upper-case alphanumeric
       dbms_random.string ('A', 4), dbms_random.string ('U', 4),
       dbms_random.string ('L', 4), dbms_random.string ('X', 4),
       sysdate + trunc (dbms_random.value (1, 100))   -- random date within the next ~100 days
from dual connect by level <= 100



2. Materialize hint

The MATERIALIZE hint tells the Oracle cost-based optimizer to materialize the result set of a subquery defined in the WITH clause into a global temporary table, instead of merging it into the main query:
WITH data AS (
  SELECT /*+ MATERIALIZE */ level AS id
  FROM   dual
  CONNECT BY level <= 10000
)
SELECT rownum AS id
FROM   data, data, data
WHERE  rownum <= 1000000;
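When the hint takes effect, the execution plan shows a TEMP TABLE TRANSFORMATION step and an internal SYS_TEMP_... table, which is one way to verify that the WITH-clause result set was really materialized.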






3. Dynamic Sampling & gather_plan_statistics Hint


CREATE TABLE CUST
(
  CUST_ID     NUMBER,
  LAST_NAME   VARCHAR2(30 BYTE),
  FIRST_NAME  VARCHAR2(30 BYTE)
)

Insert into CUST
   (CUST_ID, LAST_NAME, FIRST_NAME)
 Values
   (1028, 'RDRIZTFLNZWA', 'UPGGKGGDYF');
Insert into CUST
   (CUST_ID, LAST_NAME, FIRST_NAME)
 Values
   (1029, 'DQDQNU', 'TZSHGTTKUOV');
Insert into CUST
   (CUST_ID, LAST_NAME, FIRST_NAME)
 Values
   (1030, 'JZSOW', 'BMU');

commit;

If we run an explain plan for the SQL below

select * from cust

we notice the following NOTE. Since statistics were not gathered on the CUST table, dynamic sampling at the default level (2) was used to calculate the statistics on the fly:

Note
-----
   - dynamic sampling used for this statement (level=2)
   - Warning: basic plan statistics not available. These are only collected when:
       * hint 'gather_plan_statistics' is used for the statement or
       * parameter 'statistics_level' is set to 'ALL', at session or system level

Execute the following to get rid of the above warning.

First, gather table statistics using the DBMS_STATS package.
Second, use the commands below to see the expected and actual cardinality values (a short sketch of both steps follows the query).

alter session set statistics_level = all


select /*+ gather_plan_statistics  */  * from cust
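A minimal sketch of the pieces not shown above - gathering the statistics and then displaying the runtime plan so the estimated and actual row counts can be compared:

-- Step 1: gather statistics on CUST
exec dbms_stats.gather_table_stats (user, 'CUST');

-- After running the hinted query above, display the runtime plan;
-- E-Rows (estimated) and A-Rows (actual) appear side by side
select * from table (dbms_xplan.display_cursor (null, null, 'ALLSTATS LAST'));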


4. Dynamic Sampling levels
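Roughly: dynamic sampling levels range from 0 (disabled) to 10 (sample all blocks of the table). The default is level 2, which samples a small number of blocks for tables that have no statistics. From 12c onward, OPTIMIZER_DYNAMIC_SAMPLING can also be set to 11, which lets the optimizer decide automatically when and how much to sample.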



Wednesday, November 30, 2016

Unnecessary Locking on Parent Table due to missing index on foreign key column in Child Table

If an index does not exist on the foreign key column of a child table, DML on the child table will block DML on the parent table, because Oracle has to take a table-level lock on the child table whenever the parent key is updated or deleted.

Demonstration:

Create following sample tables. order_header is parent and order_detail is child table.


create table order_header (header_id number, customer varchar2(20))

alter table order_header add constraint pk_order_header  primary key (header_id)

create table order_detail (header_id number references order_header (header_id) on delete cascade , line_id number);

insert into order_header values (1,'a1');
insert into order_header values  (2,'a2');
insert into order_header values  (3,'a3');

insert into order_detail values  (1,1);
insert into order_detail values  (2,1);

commit;


Now, open two sessions to the database.

From session-1, delete a record from the child table.
From session-2, delete a record from the parent table. This transaction will be blocked until the transaction in session-1 is either committed or rolled back, as shown below.
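For example (the particular rows deleted do not matter):

-- Session 1: delete a child row and leave the transaction open
delete from order_detail where header_id = 1 and line_id = 1;

-- Session 2: delete a parent row - this statement hangs, waiting on a
-- table-level lock on ORDER_DETAIL, until session 1 commits or rolls back
delete from order_header where header_id = 3;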





To avoid this lock, it is recommended to create an index on the foreign key column of the child table.

create index order_detail_idx on order_detail (header_id);

Now, if we try the same delete statements, we won't see the locking any more.





PIPELINED Function vs Normal Function Use Case in Oracle

Suppose we want to return values from a function as a nested table type. Pipelined table functions will give us better performance than normal functions, because rows are returned to the caller as soon as they are piped rather than after the whole collection has been built.

PGA memory consumption is also lower with pipelined functions, since the entire collection never has to be held in memory at once.


Example:
Create Object Type

create or replace type test_obj as object (id number, dsc varchar2(100));

Create Table Type

create or replace type test_obj_tab as table of test_obj;

Create Normal Function

CREATE OR REPLACE FUNCTION test_normal (p_val IN NUMBER)
   RETURN test_obj_tab
AS
   v_test_obj_tab   test_obj_tab := test_obj_tab ();
BEGIN
   FOR i IN 1 .. p_val
   LOOP
      v_test_obj_tab.EXTEND;
      v_test_obj_tab (v_test_obj_tab.LAST) := test_obj (i, 'Number-' || i);
   END LOOP;

   RETURN v_test_obj_tab;
EXCEPTION
   WHEN OTHERS
   THEN
      DBMS_OUTPUT.put_line (' error-  ' || SQLERRM);
      RAISE;   -- re-raise, otherwise the function ends without returning a value
END;


Create PIPELINED Function

CREATE OR REPLACE FUNCTION test_pipe (p_val IN NUMBER)
   RETURN test_obj_tab pipelined 
AS
   v_test_obj_tab   test_obj_tab := test_obj_tab ();
BEGIN
   FOR i IN 1 .. p_val
   LOOP
      pipe row (test_obj (i, 'Number-' || i));
   END LOOP;

   RETURN;   -- a pipelined function ends with a RETURN that has no value
EXCEPTION
   WHEN OTHERS
   THEN
      DBMS_OUTPUT.put_line (' error-  ' || SQLERRM);
      RAISE;   -- re-raise so the caller still sees the error
END;


Testing

select * from table (test_normal (100));

select * from table (test_pipe (100));
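One rough way to see the PGA difference mentioned above (assuming access to the V$ views) is to check the session's PGA statistics before and after fetching from each function:

select n.name, round (s.value / 1024 / 1024, 1) as mb
from   v$mystat s
       join v$statname n on n.statistic# = s.statistic#
where  n.name in ('session pga memory', 'session pga memory max');

-- Run the check, then for example select count(*) from table (test_normal (1000000)),
-- run the check again, and repeat with test_pipe; the pipelined version
-- should show a much smaller increase.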



PIPELINED Table Functions to Load Data Files into Oracle Tables

To load data files (.csv) into Oracle tables we have the following options:

1) Use SQL*Loader
2) Use an external table
3) Use UTL_FILE

The traditional approach is to use SQL*Loader to perform the file loads.

External tables are the newer approach, and using pipelined table functions in combination with external tables gives a big performance benefit when loading large data files; a rough sketch is shown below.
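A minimal sketch of that combination (the directory object, file name, column layout, and target table here are all made up for illustration):

-- External table over the .csv file (assumes a directory object DATA_DIR pointing to the file location)
create table emp_ext
(
  emp_id    number,
  emp_name  varchar2(100)
)
organization external
(
  type oracle_loader
  default directory data_dir
  access parameters
  (
    records delimited by newline
    fields terminated by ','
  )
  location ('emp.csv')
);

create or replace type emp_obj as object (emp_id number, emp_name varchar2(100));

create or replace type emp_obj_tab as table of emp_obj;

-- Pipelined function: fetches from the ref cursor row by row, transforms, and pipes rows out
CREATE OR REPLACE FUNCTION load_emp (p_cur SYS_REFCURSOR)
   RETURN emp_obj_tab PIPELINED
AS
   v_id     NUMBER;
   v_name   VARCHAR2 (100);
BEGIN
   LOOP
      FETCH p_cur INTO v_id, v_name;
      EXIT WHEN p_cur%NOTFOUND;
      -- any row-level cleansing or transformation goes here
      PIPE ROW (emp_obj (v_id, UPPER (v_name)));
   END LOOP;

   CLOSE p_cur;
   RETURN;
END;

-- Load the target table (emp_target is assumed to exist) in one set-based statement
INSERT INTO emp_target
   SELECT * FROM TABLE (load_emp (CURSOR (SELECT emp_id, emp_name FROM emp_ext)));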

Code to load a file using either of these methods has been posted here:

DOWNLOAD


Monday, November 21, 2016

Useful Oracle SQL Info

    1. Trim Example

SELECT TRIM (' 7'),
       TRIM (' 777 77  '),
       TRIM (' ' FROM '  7 7 '),
       TRIM (LEADING '0' FROM '000123'),
       TRIM (TRAILING '1' FROM 'Tech1'),
       TRIM (BOTH '1' FROM '123Tech111')

  FROM DUAL







    2. SQL to add seconds, minutes, hours to the current date

SELECT SYSDATE,
       SYSDATE + (1 / (24 * 60 * 60)) "1Sec Addition",
       sysdate + interval '1' second,
       SYSDATE + (1 / (24 * 60)) "1Min Addition",
       sysdate + interval '1' minute,
       SYSDATE + (1 / 24) "1HR Addition",
       sysdate + interval '1' hour
  FROM DUAL


    3. Converting a string into rows

with t as (select 'abcd,123,defoifcd,87765' as str from dual)
    select level as n, regexp_substr(str,'[^,]+',1,level) as val
    from   t
   connect by regexp_substr(str,'[^,]+',1,level) is not null



    4. Converting a string into columns

with t as (select 'abcd,123,defoifcd,87765' as str from dual)
    select max(decode(level,1,regexp_substr(str,'[^,]+',1,level))) as val1
          ,max(decode(level,2,regexp_substr(str,'[^,]+',1,level))) as val2
          ,max(decode(level,3,regexp_substr(str,'[^,]+',1,level))) as val3
          ,max(decode(level,4,regexp_substr(str,'[^,]+',1,level))) as val4
          ,max(decode(level,5,regexp_substr(str,'[^,]+',1,level))) as val5
   from   t
  connect by regexp_substr(str,'[^,]+',1,level) is not null


    5. Listagg Test

create table dept_test
(
  dno       number,
  dname     varchar2(100 byte),
  location  varchar2(100 byte)
)

set define off;
insert into dept_test   (dno, dname, location) values   (1, 'Accounts', 'NY');
insert into dept_test   (dno, dname, location) values   (2, 'Delivery', 'NJ');
insert into dept_test   (dno, dname, location) values   (3, 'Customer Support', 'NJ');
insert into dept_test   (dno, dname, location) values   (4, 'Billing', 'NY');
insert into dept_test   (dno, dname, location) values   (5, 'Payments', 'NY');
commit;


select * from dept_test;

select location, listagg (dname, ',') within group (order by dname) dept_list
from   dept_test
group by location;

    6. Notepad++: add quotes at the beginning and end of each line in a file
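One common way is the Replace dialog (Ctrl+H) with Search Mode set to Regular expression: Find what ^(.+)$ and Replace with '$1' (or "$1" for double quotes), then Replace All.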



7. Convert a string with comma-separated values into rows using SQL (one output row per value, for every row in the table)
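The queries below assume a small table like this (the table name, columns, and data are just made up for illustration):

create table test (c1 number, c2 varchar2(100));

insert into test values (1, 'a,b,c');
insert into test values (2, 'x,y');
commit;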
select * from test;

select c1, trim (regexp_substr (c2, '[^,]+', 1, level)) c2
from   test
connect by regexp_substr (c2, '[^,]+', 1, level) is not null
       and c2 = prior c2
       and c1 = prior c1
       and prior sys_guid() is not null
order by c1;