Sunday, September 16, 2012

Copy maven dependencies to WEB-INF/lib

Use Case

Configure a Dynamic Web Project in Eclipse so that Maven dependencies (JARs) are copied to WEB-INF/lib.

Solution

1. Update eclipse.ini and specify the -vm option:

-vm
C:\Program Files\Java\jdk1.6.0_25\bin\

2. Convert the project to a Maven project and set the packaging to WAR.
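
For reference, a minimal pom.xml with WAR packaging might look like this (the coordinates are placeholders, not taken from any real project):

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <!-- placeholder coordinates; use your project's own -->
    <groupId>com.example</groupId>
    <artifactId>my-webapp</artifactId>
    <version>1.0-SNAPSHOT</version>
    <!-- WAR packaging is what marks the project as a web module -->
    <packaging>war</packaging>
</project>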

3. Configure JRE
Go to Project > Properties > Java Build Path > Libraries. Select JRE System Library and click 'Edit'. Choose an Execution Environment from the list of available environments; use the same VM that eclipse.ini points to.

4. Configure Deployment Assembly
Go to Project > Properties > Deployment Assembly. Click the Add button, then choose Java Build Path Entries. Choose Maven Dependencies from the list and click OK.


Monday, October 31, 2011

ActiveMQ 5.3 / System.IO.EndOfStreamException / "Unable to read beyond the end of the stream."

Issue

I've got a problem with long-running message listeners in ActiveMQ. After approximately 60 seconds, the long-running operation breaks with an exception: Thread was aborted.

Solution

The solution and the reason behind it are described here: https://issues.apache.org/jira/browse/AMQNET-196

After adding ?transport.keepAliveResponseRequired=true to the connection string, everything started working like a charm.
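
For illustration, here is a minimal NMS sketch; the broker address and everything apart from the transport.keepAliveResponseRequired flag are my own assumptions, not from the original post:

using System;
using Apache.NMS;
using Apache.NMS.ActiveMQ;

class KeepAliveExample
{
    static void Main()
    {
        // hypothetical broker address; the important part is the keep-alive flag appended to the URI
        Uri uri = new Uri("activemq:tcp://broker-host:61616?transport.keepAliveResponseRequired=true");
        IConnectionFactory factory = new ConnectionFactory(uri);
        using (IConnection connection = factory.CreateConnection())
        using (ISession session = connection.CreateSession())
        {
            connection.Start();
            // create consumers and register long-running message listeners as usual
        }
    }
}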

Monday, July 11, 2011

HTTP Error 404.15 – Not Found: The request filtering module is configured to deny a request where the query string is too long

Use Case

The query string exceeds the limit enforced by the IIS request filtering module for HTTP GET requests.

Solution

A quick and dirty solution is to increase the maximum query string size by setting the maxQueryString attribute on the requestLimits element in the system.webServer/security/requestFiltering configuration section of your application's web.config:

<system.webServer>
    <security>
        <requestFiltering>
            <requestLimits maxQueryString="NEW_VALUE_IN_BYTES" />
        </requestFiltering>
    </security>
</system.webServer>

http://mvolo.com/blogs/serverside/archive/2007/12/08/IIS-7.0-Breaking-Changes-ASP.NET-2.0-applications-Integrated-mode.aspx
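
If the application also runs on ASP.NET 4.0 or later, the ASP.NET runtime enforces a query string limit of its own; assuming that is your setup, the matching setting is maxQueryStringLength on httpRuntime (measured in characters rather than bytes; the value below is a placeholder):

<system.web>
    <httpRuntime maxQueryStringLength="NEW_VALUE_IN_CHARACTERS" />
</system.web>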

Or use HTTP POST, if you can.

Drop all user objects in the schema

Use Case

Drop all objects in a schema without dropping the user itself.

Solution

A very helpful script can be found at: http://stackoverflow.com/questions/842530/plsql-drop-all-database-objects-of-a-user



create or replace FUNCTION DROP_ALL_SCHEMA_OBJECTS RETURN NUMBER AS
PRAGMA AUTONOMOUS_TRANSACTION;
cursor c_get_objects is
  select object_type,'"'||object_name||'"'||decode(object_type,'TABLE' ,' cascade constraints',null) obj_name
  from user_objects
  where object_type in ('TABLE','VIEW','PACKAGE','SEQUENCE','SYNONYM', 'MATERIALIZED VIEW','FUNCTION','PROCEDURE') AND object_name <> 'DROP_ALL_SCHEMA_OBJECTS'
  order by object_type;
cursor c_get_objects_type is
  select object_type, '"'||object_name||'"' obj_name
  from user_objects
  where object_type in ('TYPE');
BEGIN
  begin
    for object_rec in c_get_objects loop
      execute immediate ('drop '||object_rec.object_type||' ' ||object_rec.obj_name);
    end loop;
    for object_rec in c_get_objects_type loop
      begin
        execute immediate ('drop '||object_rec.object_type||' ' ||object_rec.obj_name);
      end;
    end loop;
  end;
  RETURN 0;
END DROP_ALL_SCHEMA_OBJECTS;


Usage is easy:


select DROP_ALL_SCHEMA_OBJECTS from dual;
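
One caveat worth adding (my note, not from the linked answer): on Oracle 10g and later, dropped tables usually end up in the recycle bin as BIN$ objects, so if you also want that space released, purge it afterwards:

purge recyclebin;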

Wednesday, June 29, 2011

ext.net checkbox in grid header

Use Case

Place a checkbox (or virtually any custom control) in the grid header.

Solution

Use the AfterRender event to override the header cell's HTML.
Inspired by http://extjs-world.blogspot.com/2011/06/how-to-show-image-or-checkbox-html.html



function afterRender(_grid) {
    var columns = _grid.getColumnModel().config;
    var view = _grid.getView();
    var columnIndex = 0;
    Ext.each(columns, function (column) {
        if (column.id.indexOf("IDX") == 0) {
            // rebuild the header cell markup, keeping the grid's header CSS classes
            var headerHtml = '<div class="x-grid' + columnIndex + '-hd-inner x-grid' + columnIndex + '-hd-' + column.id + '" unselectable="on" style="text-align: center;">' + column.header;
            headerHtml += '&nbsp;<input type="checkbox" id="' + column.id + '_CHCK">';
            headerHtml += '</div>';
            // replace the view headerCell
            view.getHeaderCell(columnIndex).innerHTML = headerHtml;
        }
        columnIndex = columnIndex + 1;
    });
}

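The injected checkbox is plain markup rather than an Ext component, so a small helper like this sketch (the function name and usage are mine) can read its state later, e.g. from a button handler:

// hypothetical helper: read the header checkbox injected for a given column id
function isColumnChecked(columnId) {
    var chk = Ext.getDom(columnId + '_CHCK');
    return chk ? chk.checked : false;
}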

Thursday, June 23, 2011

oracle export import shell scripts

Use Case

Need to export / import an Oracle schema from the command line (Linux shell).

Solution

Great scripts can be found at: http://noerror.blogspot.com/2005/01/expimp-korn-shell-scripts.html
However, in my environment they could not be executed due to very minor syntax issues, which could be the result of a different version of the shell executable.

My corrected version is listed below (slightly updated, since I have to call exp / imp under a highly privileged account).

Usage is easy:

./export.sh -p [dba_user_password] -f [export_file_name] -s [schema_name1,schema_name2,...]
./import.sh -p [dba_user_password] -f [export_file_name] -o [from_schema_name] -t [to_schema_name]
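
A concrete run might look like this (the password, file name and schema names are made up; the named pipes are the ones hardcoded in the scripts):

# one-time setup: the scripts stream exp / imp data through named pipes
mknod ~/backup/backup_pipe p
mknod ~/backup/backup_pipe2 p

# export schemas HR and SCOTT into /home/oracle/dbcopy/nightly.Z, then load HR into HR_COPY
./export.sh -p change_on_install -f nightly -s HR SCOTT
./import.sh -p change_on_install -f nightly -o HR -t HR_COPY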

Also, some useful information can be found at:
http://www.dbaexpert.com/blog/2008/04/comprehensive-shell-script-to-export-the-database-to-devnull/
and http://dbamac.wordpress.com/2008/08/01/running-sqlplus-and-plsql-commands-from-a-shell-script/

===============  export.sh  =================


#!/usr/bin/ksh
# export.sh - Korn Shell script to export (an) Oracle schema(s).
# January 6, 2005
# Author - John Baughman
# Hardcoded values: copy_dir, pipefile, dba
# Jun 23, 2011
# Changed by SK
dba=sys

copy_dir=/home/oracle/dbcopy/
# Check the dump directory exists
if [[ ! -d "${copy_dir}" ]];then
echo
echo "${copy_dir} does not exist !"
echo
exit 1
fi

pipefile=~/backup/backup_pipe
# Check we have created the named pipe file, or else there is no point
# continuing
if [[ ! -p ${pipefile} ]];then
echo
echo "Create the named pipe file ${pipefile} using "
echo " $ mknod ${pipefile} p"
echo
exit 1
fi

# Loop through the command line parameters
# -p - The ${dba} user password.
# -f - The name of the export dump file (without extension).
# -s - The schema(s) to export.
# If any of these are missing, prompt for them.
# Now, check the parameters...
while getopts ":p:f:s" opt; do
case $opt in
p ) psg_password=$OPTARG ;;
f ) raw_name=$OPTARG ;;  #raw_name=${OPTARG##/*/} ;;
s ) shift $(($OPTIND-1))
raw_schema=$* ;;
\? ) print "usage: export.sh -p password -f dump_file -s schema[...]"
exit 1 ;;
esac
done

# Get the missing ${dba} user password
while [[ -z ${psg_password} ]]; do
read psg_password?"Enter ${dba} Password: "
done

# Get the missing SID
#while [[ -z ${sid} ]]; do
#read sid?"Enter Oracle SID: "
#done

# Get the missing "raw" name
while [[ -z ${raw_name} ]]; do
read raw_name?"Enter the export dump file name: "
done



while [[ -z ${raw_schema} ]]; do
print "Enter the schema(s)"
read raw_schema?"(if more than one schema, they can either be space or comma delimited): "
done

# Fix up the raw schema list
schema_list=""
#raw_schema=$(print $raw_schema tr "[a-z]" "[A-Z]")
for name in $raw_schema; do
if [[ -z $schema_list ]]; then
schema_list="${name}"
else
schema_list="${schema_list},${name}"
fi
done
raw_schema=${schema_list}
schema_list="(${schema_list})"

# Fix up the export file name here...
export_file=${copy_dir}${raw_name}.Z

# Let's go!!!
print ""
print "*******************************************************"
print "* Exporting ${raw_schema} schema(s) at `date`"
print "*******************************************************"
print

compress < ${pipefile} > ${export_file} &
exp \'${dba}/${psg_password} as sysdba\' direct=y log=${copy_dir}${raw_name}.exp.log statistics=none buffer=1000000 feedback=10000 file=${pipefile} owner=${schema_list}
#exp ${dba}/${psg_password}@${sid} direct=y log=${copy_dir}${raw_name}.exp.log statistics=none buffer=1000000 feedback=10000 file=${pipefile} owner=${schema_list}

print
print "*******************************************************"
print "* Export Completed at `date`"
print "*******************************************************"
print

exit 0



===============  import.sh  =================

#!/usr/bin/ksh
# import.sh - Korn Shell script to import (an) Oracle schema(s).
# January 6, 2005
# Author - John Baughman
# Hardcoded values: copy_dir, pipefile, dba
# Jun 23, 2011
# Changed SK

dba=sys

copy_dir=/home/oracle/dbcopy/
# Check the dump directory exists
if [[ ! -d "${copy_dir}" ]];then
echo
echo "${copy_dir} does not exist !"
echo
exit 1
fi

pipefile=~/backup/backup_pipe2
# Check we have created the named pipe file, or else there is no point
# continuing
if [[ ! -p ${pipefile} ]]; then
echo
echo "Create the named pipe file ${pipefile} using "
echo " $ mknod ${pipefile} p"
echo
exit 1
fi

# Loop through the command line parameters
# -p - The ${dba} user password.
# -f - The export dump file to import (without extension).
# -o - The schema the dump was exported from (fromuser).
# -t - The schema to import into (touser).
# If any of these are missing, prompt for them.
# Now, check the parameters...
while getopts ":p:f:o:t" opt; do
case $opt in
p ) psg_password=$OPTARG ;;
f ) raw_name=${OPTARG##/*/} ;;
o ) from_user=$OPTARG ;;
t ) shift $(($OPTIND-1))
to_user=$* ;;
\? ) print "usage: import.sh -p password -f dump_file -o from_user -t to_user"
exit 1 ;;
esac
done

# Get the missing ${dba} user password
while [[ -z ${psg_password} ]]; do
read psg_password?"Enter ${dba} Password: "
done

# Get the missing SID
#while [[ -z ${sid} ]]; do
#read sid?"Enter Oracle SID: "
#done

# Get the missing "raw" name
while [[ -z ${raw_name} ]]; do
read raw_name?"Enter the export dump file name: "
done

# Get the missing "from_user" name
while [[ -z ${from_user} ]]; do
read from_user?"Enter the from_user name: "
done

# Get the missing "to_user" name
while [[ -z ${to_user} ]]; do
read to_user?"Enter the to_user name: "
done

# Fix up the export file name here...
export_file=${copy_dir}${raw_name}.Z

print
print "*******************************************************"
print "* Importing ${export_file} at `date`"
print "*******************************************************"
print

uncompress < ${export_file} > ${pipefile} & imp \'${dba}/${psg_password} as sysdba\' log=${copy_dir}${raw_name}.imp.log statistics=none buffer=1000000 feedback=10000 file=${pipefile} ignore=y fromuser=${from_user} touser=${to_user}

print
print "*******************************************************"
print "* Import Completed at `date`"
print "*******************************************************"
print