Passing a set to a stored procedure

Post by Larry Coon » Wed, 22 Mar 2000 04:00:00



Is there a way (or a workaround that works
well) to pass a set of values into a stored
procedure?  Something like:

create procedure sp_here_is_what_i_want

as
   select *
   from my_table

   return

Larry Coon
University of California


 
 
 

Passing a set to a stored procedure

Post by Mark A. Parsons » Thu, 23 Mar 2000 04:00:00



> Is there a way (or a workaround that works
> well) to pass a set of values into a stored
> procedure?  Something like:

> create procedure sp_here_is_what_i_want

> as
>    select *
>    from my_table

>    return

You didn't mention what version of Sybase you're running.  ASE 12 might
allow something like this ... but I'm not sure, as I haven't had a chance
to play with 12 yet.

Assuming you're not running ASE 12 ... your example won't work (duh,
Mark?!?! :-)

A couple of workarounds include:

1) Put your 'set of values' into a #temp table and do a join between
my_table and #temp; the idea is to put each 'value' into its own row

2) Pass all of your 'values' into the stored proc as individual arg's;
populate a #temp table with said values; and see #1 (above)

3) Pass all of your 'values' into the stored proc as individual arg's;
run through a loop running your 'select' with each of the possible
'values' as the search argument in your 'where' clause

4) Pass all of your arg's into the stored proc as a single string (e.g.,
your example) and look at using one of the system-supplied string
functions to do some sort of match, e.g., "charindex()", "patindex()",
etc. (see the manuals for a complete list of string functions); a sketch
of this approach appears below

5) ... ???

Which method you use, of course, depends on your environment, e.g., how
often do you need to code this functionality, how many 'values' must you
fit into that one argument (there's a 255 byte limit on character string
arguments), how many rows in my_table (#4 won't use an index so you
could have a long wait while you scan a large table), etc.
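
For example, a minimal sketch of approach #4, borrowing my_table and
pk_val from the other examples in this thread (so all names here are
illustrative, not a recommendation):

/* Approach #4: the keys arrive as one comma-delimited varchar and
   charindex() tests for membership.  Wrapping both sides in commas
   keeps "12" from matching "112".  Remember the 255-byte argument
   limit and the table-scan caveat mentioned above. */
create procedure find_rows_by_list
    @key_list varchar(255)          /* e.g. "1,2,3" */
as
    select *
    from   my_table
    where  charindex(',' + convert(varchar(20), pk_val) + ',',
                     ',' + @key_list + ',') > 0
    return
go

/* Example call */
exec find_rows_by_list '1,2,3'
go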

--
Mark Parsons
Pegasys Ltd

 
 
 

Passing a set to a stored procedure

Post by Jason Webster » Thu, 23 Mar 2000 04:00:00



>Is there a way (or a workaround that works
>well) to pass a set of values into a stored
>procedure?

/*You need to create the temp table in order to create the proc, but then
you can drop it. */
Create table #tmp (pk_val numeric(10))
go

/* Create the proc */
Create proc myproc as
select *
from mytable a, #tmp b
where a.pk_val=b.pk_val
return
go

/* The proc is compiled, so the scaffold table can now be dropped. */
drop table #tmp
go

/*Now once the proc is in place, you can use it at any time by doing this*/
/*Create the temp table */
Create table #tmp (pk_val numeric(10))
go

/*Put some data in it. Use any means you like. I did it this way here for
simplicity*/
insert #tmp
select 1
union select 2
union select 3
go

/*Execute the proc*/
exec myproc
go

Jason Webster

 
 
 

Passing a set to a stored procedure

Post by Larry Coon » Thu, 23 Mar 2000 04:00:00



> You didn't mention what version of Sybase you're running?  ASE 12 might
> allow something like this ... but I'm not sure as I haven't had a chance
> to play with 12, yet.

Argh, sorry.  11.9.3 on AIX.

Quote:> Assuming you're not running ASE 12 ... your example won't work (duh,
> Mark?!?! :-)

:-)

Quote:> A couple work arounds include:

> 1) Put your 'set of values' into a #temp table and do a join between
> my_table and #temp; the idea is to put each 'value' into it's own row

That's what I originally thought of, but I was hoping
for a better way than incurring the overhead of a
temp table creation, an insert and a select.

Quote:> 2) Pass all of your 'values' into the stored proc as individual arg's;
> populate a #temp table with said values; and see #1 (above)

Individual args won't work, as their number is so
variable.  There could literally be a thousand.

Quote:> 3) Pass all of your 'values' into the stored proc as individual arg's;
> run through a loop running your 'select' with each of the possible
> 'values' as the search argument in your 'where' clause

Same as above.  Too many elements in the set.

Quote:> 4) Pass all of your arg's into the stored proc as a single string (e.g.,
> your example) and look at using one of the system-supplied string
> functions to do some sort of match, e.g., "charindex()", "patindex()",
> etc. (see the manuals for a complete list of string functions)

That's the OTHER thing I thought about, but given
the number of items and their size (each is a key,
and is declared as numeric(15)), a thousand keys
plus delimiters runs to roughly 16K characters --
way past the 255-byte limit -- so I'd have to use
the text datatype, and most of the string functions
are out.  Besides, the overhead of concatenating
and then parsing a string like this may even be
worse than the overhead of using a temp table.

Quote:> 5) ... ???

Yeah, this is where I ended up when I decided to post.

:-)

Quote:> Which method you use, of course, depends on your environment, e.g., how
> often do you need to code this functionality, how many 'values' must you
> fit into that one argument (there's a 255 byte limit on character string
> arguments), how many rows in my_table (#4 won't use an index so you
> could have a long wait while you scan a large table), etc.

The situation is that there is a very large, multi-
table join with ugly SARGs needed to identify a
set of rows in a table (call it table A).  Once I
have the primary keys I need from A (each, as I
said, is an identity column declared numeric(15);
there could be as few as ten or so, or as many as
a thousand or so), I need to give the list of them
to three different routines, each of which does
something different based on the set of keys.
Doing the initial join to get the list of keys
from A once and only once saves me a lot of time,
if I could then use the result in each of the
three procs.  It's also not feasible to combine
the three procs.
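
For concreteness, the caller side of that flow might look
roughly like the sketch below (table and procedure names
are made up, and -- as noted elsewhere in this thread --
each proc would have to be created while a #keys table
existed so that it compiles):

/* Run the expensive join once, park the keys in #keys,
   then let each of the three procs join against it.
   Illustrative names only. */
create table #keys (pk_val numeric(15))
go

insert #keys (pk_val)
select a.pk_val
from   table_a a        /* stands in for the large     */
where  a.some_col = 1   /* multi-table join and SARGs  */
go

exec proc_one
exec proc_two
exec proc_three
go

drop table #keys
go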

Thanks!

Larry Coon
University of California


 
 
 

Passing a set to a stored procedure

Post by Larry Coon » Thu, 23 Mar 2000 04:00:00


(solution snipped)

Jason,

Thanks for the reply.  As I already replied to Mark,
I was hoping there might be a way that avoids the use
of temp tables.  I guess not.

Thanks!

Larry Coon
University of California


 
 
 

Passing a set to a stored procedure

Post by Mark A. Parsons » Fri, 24 Mar 2000 04:00:00


...snip...

Quote:> That's the OTHER thing I thought about, but given the
> number of items and their size (each is a key, and is
> declared as numeric(15)) I'd have to use the text
> datatype, so most of the string functions are out.
> Besides, the overhead of concatenating and then parsing
> a string like this may even be worse then the overhead
> of using a temp table.

Actually, some of the string functions *do* work on text datatypes;
however, you won't be able to pass a 'text' argument to a stored proc.
In other words, you'd have to build the text value and place it in a
table ... then do a query against it.

Quote:> There could be as few as ten or so, or as many as
> a thousand or so) I need to give the list of them

*whoah*  "as many as a thousand or so"??

I think you're going to find that the time to build/concatenate a
"thousand or so" keys into one large text string ... combined with the
time to run through (search) this string in a query ... will probably
take longer than the use of a temporary table.

Quote:> to three different routines which do something
> different based on the set of keys.  Doing the
> initial join to give me the list of keys in A once
> and only once saves me a lot of time, if I could
> then use the result in each of the three procs.
> It's also not feasible to combine the three procs.

You didn't mention how often you do this ... once/day, 1000
times/minute, ?? ... so FWIW ...

What you're describing is exactly what #temp tables are good at, i.e.,
storing intermediate result sets for further calculations/queries.  And
since they are in 'table' form you can query them just like other tables
(as opposed to the process of building a text string and then searching
it??).

So, how 'bad' is the contention in your tempdb database if you *do* use
#temp tables?  Is it worse than the headaches of building a big honkin'
text string and then searching it?

If you don't like the idea of creating/using #temp tables ... what about
setting up a set of 'permanent' scratch tables?  You could control their
use (by clients) by checking them in and out via a table used to show
which client owns which table during a specific period of time.  (Ask if
this doesn't make sense and I'll try to detail it.)  You could also add
indices to these tables and give the clients the ability to 'update
stats' after populating them with the set of desired keys.
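
To make the check-in/check-out idea a little more concrete, here's a
rough sketch (every name is made up, and a real version would need
null/retry/deadlock handling):

/* One registry row per permanent scratch table; owner_spid is null
   when the table is free. */
create table scratch_registry
(
    table_name   varchar(30)  not null,   /* e.g. 'scratch_keys_01' */
    owner_spid   int          null,
    checked_out  datetime     null
)
go

/* Check a scratch table out for this connection. */
declare @tab varchar(30)
begin tran
    select @tab = min(table_name)
    from   scratch_registry holdlock
    where  owner_spid is null

    update scratch_registry
    set    owner_spid = @@spid, checked_out = getdate()
    where  table_name = @tab
commit tran

select scratch_table = @tab    /* tell the client which table it got */
go

/* Later, check it back in (keyed off the spid). */
update scratch_registry
set    owner_spid = null, checked_out = null
where  owner_spid = @@spid
go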

Right  off the bat it sounds like #temp tables (or 'permanent' scratch
tables) are your best bet.  You want to find an intermediate set of keys
... and it's too 'expensive' to do this every time you need them ... so
you want to store them *somewhere*, for future reference in other
queries.  Concatenating them in a big honkin' string doesn't quite make
sense (?) since you have to spend time to build the string and then
spend time manually searching through it for the desired key(s).
Storing them in the client, or in a flat file, won't do much good as now
your application has to serve as the 'join' engine between the set of
keys and the additional SQL statements ... *ick*.

Unless you're finding that you have way too much contention in tempdb
I'd suggest you go with #temp tables.  And if you *do* end up with
contention in tempdb ... we can look at some ways to alleviate that ...
but we'll need a better idea of your operating environment and tempdb
traffic.

--
Mark Parsons
Pegasys Ltd

 
 
 

Passing a set to a stored procedure

Post by Jason Webster » Fri, 24 Mar 2000 04:00:00


I hear you. Since I started working with Sybase databases in '94 (mssqlserver
4.21), I've wanted the ability to pass a recordset to a stored procedure.
Often I need to pass sets -- like the detail of a purchase order, for
example -- to a stored proc and have it process the set. I like to keep the
transactions on the backend, so I don't like the practice of issuing a begin
tran in the client, repetitively calling a backend stored proc with each
row, and then issuing a commit from the client. Unfortunately, since you
can't pass a set to a proc (there's not even support for arrays in T-SQL),
this is at best a cumbersome task to pull off.
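
Applied to that purchase-order example, the #temp table workaround might
look something like this sketch (the table, column and procedure names
are invented, and a permanent po_detail table is assumed):

/* The client bulk-loads the detail rows into a #temp table, then a
   single proc call processes the whole set in one backend transaction. */
create table #po_detail (po_num int, line_no int, part_no int, qty int)
go

create procedure process_po_detail
as
    begin tran
        insert po_detail (po_num, line_no, part_no, qty)
        select po_num, line_no, part_no, qty
        from   #po_detail
    commit tran
    return
go

/* Client side: load the set, make one call. */
insert #po_detail values (1001, 1, 42, 10)
insert #po_detail values (1001, 2, 43,  5)
exec process_po_detail
go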


>Jason,

>Thanks for the reply.  As I already replied to Mark,
>I was hoping there might be a way that avoids the use
>of temp tables.  I guess not.

>Thanks!

>Larry Coon

Jason Webster
 
 
 

Passing a set to a stored procedure

Post by Larry Coon » Fri, 24 Mar 2000 04:00:00



> Actually, some of the string functions *do* work on text datatypes,
> however, you won't be able to pass a 'text' argument to a stored proc.
> In other words, you'd have to build the text value and place it in a
> table ... then do a query against it.

Right, which is pointless since the point of this
method was to avoid using a table in the first
place...

Quote:> *whoah*  "as many as a thousand or so"??

> I think you're going to find that the time to build/concatenate a
> "thousand or so" keys into one large text string ... combined with the
> time to run through (search) this string in a query ... will probably
> take longer than the use of a temporary table.

I agree.  I wasn't even going to bother prototyping
this method, but I will prototype & benchmark the
#temp table method.

Quote:> You didn't mention how often you do this ... once/day, 1000
> times/minute, ?? ... so FWIW ...

I'm guessing about 100 times a day, once this system
is in production.

Quote:> What you're describing is exactly what #temp tables are good at, i.e.,
> storing intermediate result sets for further calculations/queries.

Yeah, I was just hoping that there was a better way
of doing argument passing like I want to do than using
#temp tables as an intermediary.  But from what I've
heard, this sounds like the best approach.

Quote:> And
> since they are in 'table' form you can query them just like other tables
> (as opposed to the process of building a text string and then searching
> it??).

> So, how 'bad' is the contention in your tempdb database if you *do* use
> #temp tables?  Is it worse than the headaches of building a big honkin's
> text string and then searching it?

Also agreed.  Who wants to do string parsing when it
can be avoided?

Quote:> If you don't like the idea of creating/using #temp tables ... what about
> setting up a set of 'permanent' scratch tables?  You could control their
> use (by clients) by checking them in and out via a table used to show
> which client owns which table during a specific period of time.  (Ask if
> this doesn't make sense and I'll try to detail it.)  You could also add
> indices to these tables and give the clients the ability to 'update
> stats' after populating them with the set of desired keys.

This seems like way more trouble than it's worth.  The
benefit to #temp tables is that each user gets his/her
own table, so you don't have to worry about reading the
other guy's rows.

It's not that I don't "like" the idea of using temp
tables; it's that: 1) I wanted to see if there was
a more efficient way, or at least a more "standard"
way, and 2) it didn't seem like the right tool for
the job.  I use #temp tables a lot within procedures
to build result sets, but I've never used them
between procedures as a method for argument passing.

Quote:> Right  off the bat it sounds like #temp tables (or 'permanent' scratch
> tables) are your best bet.  You want to find an intermediate set of keys
> ... and it's too 'expensive' to do this every time you need them ... so
> you want to store them *somewhere*, for future reference in other
> queries.  Concatenating them in a big honkin' string doesn't quite make
> sense (?) since you have to spend time to build the string and then
> spend time manually searching through it for the desired key(s).
> Storing them in the client, or in a flat file, won't do much good as now
> your application has to serve as the 'join' engine between the set of
> keys and the additional SQL statements ... *ick*.

> Unless you're finding that you have way too much contention in tempdb
> I'd suggest you go with #temp tables.  And if you *do* end up with
> contention in tempdb ... we can look at some ways to alleviate that ...
> but we'll need a better idea of your operating environment and tempdb
> traffic.

Thanks,

Larry Coon
University of California


 
 
 

Passing a set to a stored procedure

Post by Anthony Mandic » Sat, 25 Mar 2000 04:00:00



> Yeah, I was just hoping that there was a better way
> of doing argument passing like I want to do than using
> #temp tables as an intermediary.  But from what I've
> heard, this sounds like the best approach.

        Can you describe in a little more detail the
        exact nature of your ultimate problem, or what
        sort of processing is going on that led you to
        this "solution"? Chances are that there may be
        a more elegant solution that bypasses this
        entirely.

-am

 
 
 

Passing a set to a stored procedure

Post by Anthony Mandic » Sat, 25 Mar 2000 04:00:00



> I hear you. Since I started working with Sybase databases in 94 (mssqlserver
> 4.21),

        Which doesn't really count, but it reminds me of my Jeff Wong anecdote
        #247, "Jeff Wong and the Windows NT/MS SQL Server saga", which happened
        around the same time. But I'll spare everyone the details of the
        anecdote for now (unless someone could use a good laugh, in which
        case I'll post it).

-am

 
 
 

Passing a set to a stored procedure

Post by Larry Coon » Sat, 25 Mar 2000 04:00:00


Anthony Mandic wrote:
>         Can you describe in a little more detail the
>         exact nature of your ultimate problem, or what
>         sort of processing is going on that led you to
>         this "solution"? Chances are that there may be
>         a more elegant solution that bypasses this
>         entirely.

Sure, but this may take a while.

This database (and our project) deals with student transfer
between the California community colleges, University of
California system, and California State University system.
There are about 160 public institutions of higher education
in California, and there are agreements in place (called
"articulation agreements") which specify how courses taken
at one institution can be used at another institution when
a student transfers.  An articulation agreement can be a
lengthy and complex document -- some agreements between
two schools are 300 pages long.

Ultimately, the basic "unit" of articulation is called an
"instance."  An instance can be very simple, such as
MATH 1 here can be used in lieu of MATH 101 there.  An
instance can also be complex, with six courses on one side,
nine on the other, and half a dozen restrictions about how
they can be used.

To complicate matters further, we need to keep track of how
things change over time.  For example, an instance contains
a foreign key to a course.  If I display an articulation
agreement for the 99-00 academic year, I need to know that
the course was called "MATH 1" in that year, but if I
display the same agreement for the 97-98 academic year, I
need to know that the same course was called "MATH 11A" at
that time.  The course attributes that can change over
time (independently) are its name, title, unit value, UC
transferability, CSU transferability, lab component and
cross-listing in multiple departments.  In addition to
courses changing over time, institution information (name
and term type), departments, course prefixes, majors and
articulation instances all change over time while
maintaining their identity.

An articulation agreement can be viewed by department or
by major.  In either case, it's the same instances, but
the organization is different.

If you want an example of what one of these looks like,
here is a small one -- between American River College
and UC Santa Cruz for 97-98 for the Business Management
Economics major:

http://www.assist.org/cgi-bin/7g31.pl?event=19&pgid=Page4_aa&kind=3&o...

(sorry if that line wrapped)

The general site, from which any agreement can be
displayed is: http://www.assist.org

Okay, the tables involved, with approximate row counts,
are as follows.  These aren't all of the tables in the
database, just the primary ones used to produce the
information for an articulation agreement:

a_art_aa: One row for each articulation agreement (10,000
rows)

a_art_instance: One row for each instance of articulation
(3 million rows)

a_major: one row for each major at each institution (3,000
rows)

a_major_hist_namechange: historical name change information
for majors (400 rows)

a_art_aa_instance_major: Represents ternary relationship
between agreements, instances and majors (6 million rows)

a_art_aa_instance_signoff: represents approval dates for
instances as they apply to agreements (4.5 million rows)

a_art_instance_r_course: represents the many-to-many
relationship between instances and courses on the receiving
side (2.5 million rows)

a_art_instance_s_course: represents the many-to-many
relationship between instances and courses on the sending
side (1.5 million rows)

a_art_instance_r_text: One instance can have many pieces
of text that can appear in addition to or instead of
receiving courses (eg: "No Course Articulated" in the
sample I gave) (750,000 rows)

a_art_instance_s_text: same as the above for the
sending side (1.5 million rows)

a_articulation_footnote: one row for each footnote
associated with an instance (200,000 rows)

a_art_instance_footnote: Represents many-to-many
relationship between a_art_instance and
a_articulation_footnote (500,000 rows)

a_art_instance_comment: one row for each comment
associated with an instance (700,000 rows)

a_course: One row for each course (500,000 rows)

a_course_hist_name: tracks course name changes over time
(20,000 rows)

a_course_hist_title: tracks course title changes over
time (15,000 rows)

a_course_hist_units: tracks course unit changes over
time (5,000 rows)

a_course_hist_uc: tracks UC transferability changes
over time (3,000 rows)

a_course_hist_csu: tracks CSU transferability changes
over time (4,000 rows)

a_course_hist_lab: tracks lab component changes
over time (500 rows)

a_course_sameas: tracks cross-listing of courses
(20,000 rows)

a_department: one row for each department at each
institution (11,000 rows)

a_dept_hist_namechange: tracks historical name
changes for departments (500 rows)

a_course_prefix: one row for each course prefix
(eg: MATH, BIOL, HIST) at each institution (15,000
rows)

a_dept_course_prefix: tracks many-to-many
relationship between departments and course
prefixes (14,000 rows)

a_institution: one row for each institution (160
rows)

a_inst_hist_namechange: tracks historical name
changes for institutions (10 rows)

a_ay_date: one row for each academic year (50
rows)

a_term_date: one row for each term (200 rows)

Okay, if you're still with me, to display an
articulation agreement by major for one particular
major (like the example you saw), you need to know:
  The sending institution
  The receiving institution
  The academic year
  The major

To find out the instances involved in an agreement
by major, knowing the above, you need to join:
  a_art_aa
  a_institution (twice)
  a_art_instance
  a_art_aa_instance_major
  a_art_aa_instance_signoff
  a_major

Then for each instance, you need to:
  1. Get the info for all the sending courses
     involved in that instance.  The info must be
     current of the academic year specified.
  2. Same for receiving courses
  3. Same for sending & receiving text
  4. Same for comments and footnotes.

So for each instance, you need to do a join of:
  a_art_instance
  a_art_instance_s_course
  a_art_instance_r_course
  a_course (twice)
  a_art_instance_s_text
  a_art_instance_r_text
  a_art_instance_comment
  a_art_instance_footnote
  a_articulation_footnote

And for each course in each instance, in order to
get the right set of attributes, you need to join:
  a_course
  a_course_hist_name
  a_course_hist_title
  a_course_hist_units
  a_course_hist_uc
  a_course_hist_csu
  a_course_hist_lab
  a_course_sameas
  a_course_prefix
  a_dept_course_prefix
  a_department
  a_dept_hist_namechange

Once you have a correctly formatted instance with
all the time-correct information, you put all the
instances together in order, add extra things at
the agreement level, and you're done.

We have perl code (using sybperl) to put together
an agreement.  Given the schools, year and major,
it gets the agreement-level info, figures out the
ordered list of instances to appear in the
agreement, then for each instance, formats the
individual instance.  The instance formatting
code does a database call to get the course info
for all the courses for that instance, formats
them, and returns the formatted info back to the
agreement-formatting code, which, once it's
iterated over the entire list of instances, puts
it all together and returns it for display.

So just counting instances and courses, and
ignoring the rest, let's say there are 20
instances in a specific agreement for a specific
major.  The database calls then are:
  One call for the agreement-level info.
  One call to get the instances.
  For each instance (there are 20 of them)
     One call to get the sending course info
     One call to get the receiving course info

So that's a total of 42 database queries.  We
want to change it to the following:
  One call for the agreement-level info.
  One call to get the instances.
  One call to get ALL the sending course info
  One call to get ALL the receiving course info

Then iterate over the three result sets, rather
than what we currently do, which is to iterate
over the one instance result set and do individual
calls for the course info for that instance.  This
would involve a total of four database queries,
which is a lot better than 42.

Okay, the procedure to get the list of instances
is non-trivial, and takes time to run.  So we
only want to run it once.  Once we have the
ordered list of instances, we want to pass it to
the "get ALL sending course info" and the "get
ALL receiving course info" procedures.  But this
means passing these routines a SET of instance
ID's rather than just one at a time.  That's why
I was asking for an efficient way of passing a
result set to a stored procedure.

Sorry about being so long-winded, but you asked.
:-)

I hope that made sense.

Larry Coon
University of California
la...@assist.org
and lmc...@home.com

 
 
 

Passing a set to a stored procedure

Post by Anthony Mandic » Sun, 26 Mar 2000 04:00:00



> Sure, but this may take a while.

        I was afraid of that!

Quote:> (sorry if that line wrapped)

        It didn't but I didn't check the link. I get the picture.

Quote:> Okay, if you're still with me,

        Yes, so far.

Quote:> Once you have a correctly formatted instance with
> all the time-correct information, you put all the
> instances together in order, add extra things at
> the agreement level, and you're done.

        You make it sound so simple.

Quote:> We have perl code (using sybperl) to put together
> an agreement.  Given the schools, year and major,
...
> So just counting instances and courses, and
> ignoring the rest, let's say there are 20
> instances in a specific agreement for a specific
> major.  The database calls then are:
>   One call for the agreement-level info.
>   One call to get the instances.
>   For each instance (there are 20 of them)
>      One call to get the sending course info
>      One call to get the receiving course info

> So that's a total of 42 database queries.  We
> want to change it to the following:
>   One call for the agreement-level info.
>   One call to get the instances.
>   One call to get ALL the sending course info
>   One call to get ALL the receiving course info
...
> Okay, the procedure to get the list of instances
> is non-trivial, and takes time to run.  So we
> only want to run it once.  Once we have the
> ordered list of instances, we want to pass it to
> the "get ALL sending course info" and the "get
> ALL receiving course info" procedures.  But this
> means passing these routines a SET of instance
> ID's rather than just one at a time.  That's why
> I was asking for an efficient way of passing a
> result set to a stored procedure.

> Sorry about being so long-winded, but you asked.

        Sorry I did. Now I have a headache - my head's swimming
        with instances of agreements :-)

        But seriously, from your description, and without having
        a solid foundation to base my statements on (not having had
        a good eyeball at the schema design and the table data), it
        looks like there is a top level of the process with only a
        few parameters required. I think this is school, year and
        major (but it doesn't matter if it isn't. My chief point is
        that there is a top level with only a few parameters). Now
        could the processing be written in such a way that these form
        the parameters to one sproc call? You already mention that
        the sproc to get instances is non-trivial. So an encompassing
        sproc would appear to be the mother of all sprocs.

        I figure you'd say no due to the complexity. The next alternative
        would be to change the sproc getting the instance list to return
        the instances and all the course info in another result set. This
        makes it even more non-trivial but not quite as large as the first
        suggestion.

        The last option is to not make an sproc call to get the course
        info. Use an embedded T-SQL call in the sybperl code instead and
        construct the set within the code and use an in clause.
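
        For illustration, the statement the sybperl code would build
        and send might look roughly like this (the column names and
        ID values are guesses -- only the table names come from the
        earlier post):

        select sc.instance_id, c.*
        from   a_art_instance_s_course sc, a_course c
        where  sc.course_id   = c.course_id
          and  sc.instance_id in (1001, 1002, 1003)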

-am

 
 
 

Passing a set to a stored procedure

Post by Larry Coon » Tue, 28 Mar 2000 04:00:00



> > Sorry about being so long-winded, but you asked.

>         Sorry I did. Now I have a headache - my head's swimming
>         with instances of agreements :-)

Try living this stuff for several years....  :-)

Quote:>         But seriously, from your description, and without having
>         a solid foundation to base my statements on without having
>         a good eyeball at the schema design and the table data, it
>         looks like there is a top level of the process with only a
>         few parameters required. I think this is school, year and
>         major (but it doesn't matter if it isn't. My chief point is
>         that there is a top level with only a few parameters).

Right.  TWO schools, year and major actually, but your
assessment is correct.  There are several parameters
I left out for simplicity (that's right, you got the
SIMPLE version) such as whether to include "draft"
instances, whether to include additional information
useful for the data entry people, whether to format
it right-left or left-right, and a couple of other
things.

The other complication is that this (articulation by
major) is an example of ONE of the report types, but
there are others: articulation by department, articulation
for GE/Breadth, and a couple of maintenance-mode reports.
All use the same low-level code, with different instance
selectivity and different ways of putting the final
report together.

Quote:>         Now
>         could the processing be written in such a way that these form
>         the parameters to one sproc call? You already mention that
>         the sproc to get instances is non-trivial. So an encompassing
>         sproc would appear to be the mother of all sprocs.

There is so much one-to-many data that multiple result
sets seemed logical.  One agreement has many majors,
one major has many instances (actually, major-instance
is many-to-many), one instance has many sending courses
AND many receiving courses AND many footnotes.  So a
single stored proc, with a single result set, just seemed
too unwieldy.  Plus it took away a lot of the code
reusability -- for instance, given a date and the ID of
a course, returning all of the course attributes current
of that date is used not only by the articulation reports,
but also a couple dozen other types of reports we run off
this database.

Quote:>         I figure you'd say no due to the complexity. The next alternative
>         would be to change the sproc getting the instance list to return
>         the instances and all the course info in another result set. This
>         makes it even more non-trivial but not quite as large as the first
>         suggestion.

I think the one-to-manys do us in by doing it this way.
You have a set of instances (average 20) within a major,
and within an instance you have independent sets of sending
and receiving courses.  Each of these sets is 0-50 courses
per instance, but the median is small.  Fitting all of this
into a single result set seemed awkward.

Quote:>         The last option is to not make an sproc call to get the course
>         info. Use an embedded T-SQL call in the sybperl code instead and
>         contruct the set within the code and use an in clause.

The code to construct the course info, as I said, is really
non-trivial.  It really did seem to work better as a stored
proc.  For example, the routine to get the sending course info
for an instance is about 200 lines of (well-formatted and
commented) SQL.

So the best solution I could come up with is the one I
originally asked about -- get all of the sending course
info for one major in one sproc call, get all of the
receiving course info for that major in another sproc
call, and put them all together in perl.  But this
requires the sproc calls to take a set of instance IDs
rather than one instance ID, hence my original question
(once I get passing a set of IDs into the sproc working,
changing "=" to "in" within the proc is trivial).

We have a big demo coming up tomorrow.  Once that is
finished, I'll spend some time getting the technique of
passing the set of IDs in with a #temp table working
and see how well that performs.

Thanks,

Larry Coon
University of California


 
 
 

Passing a set to a stored procedure

Post by Larry Coon » Tue, 28 Mar 2000 04:00:00



> Unless you're finding that you have way too much contention in tempdb
> I'd suggest you go with #temp tables.  And if you *do* end up with
> contention in tempdb ... we can look at some ways to alleviate that ...
> but we'll need a better idea of your operating environment and tempdb
> traffic.

Just to follow up on this one: Other than what's listed
in chapter 19 of the P&T Guide, is there anything I should
be looking at to reduce tempdb contention?  It's on its
own physical disk, it's sized large enough, and it has its
own data cache.  Any other biggies?

Also, the P&T Guide talks about making tempdb big enough,
but doesn't say anything about negative effects resulting
from tempdb being too big.  Can there be any negative
effects if tempdb is too big?  Rather than go through the
formula the P&T Guide gives, I just set tempdb to 1 GB
(we have plenty of disk space to burn).

Larry Coon
University of California


 
 
 

Passing a set to a stored procedure

Post by Mark A. Parsons » Wed, 29 Mar 2000 04:00:00


Quote:> Also, the P&T Guide talks about making tempdb big enough,
> but doesn't say anything about negative effects resulting
> from tempdb being too big.  Can there be any negative
> effects if tempdb is too big?  Rather than go through the
> formula the P&T Guide gives, I just set tempdb to 1 GB
> (we have plenty of disk space to burn).

Except for the possible contention on the system tables (in the case of
lots of temp tables being created/destroyed in short periods of time)
you can size/design tempdb just like you would for any other database
... based on how it'll be used.

Some things you could look into ... use a RAM disk (will eat up
additional CPU cycles since there'll be additional overhead for managing
RAM) ... use a solid state disk (similar to RAM disk except minimal
overhead to CPU; also costs more than RAM) ... put tempdb on a striped
(RAID 0) disk configuration to spread out I/O's ... look at putting log
and data on separate disks/spindles ... give tempdb its own cache ... etc.
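
For reference, giving tempdb its own named cache is done with the
standard system procedures, roughly like this (the cache name and size
are made up, and older ASE versions need a restart before cache changes
take effect -- check the System Administration Guide for your release):

/* Create a named cache and bind tempdb to it (illustrative values). */
sp_cacheconfig 'tempdb_cache', '100M'
go
sp_bindcache 'tempdb_cache', 'tempdb'
go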

One other item to look at is whether you really need the #temp table at
all.  I had one client where the developers learned how to use #temp
tables, however, they went overboard and started using #temp tables all
the time ... when many times the #temp table wasn't needed.  (I'm not
necessarily talking about your situation that got this thread started
... I'm talking about an issue on the 'general' use of #temp tables.)

--
Mark Parsons
Pegasys Ltd

 
 
 

Passing a set to a stored procedure?

Depending on what the stored proc does with the list of values, you could
pass them in as a comma separated string in a single input parameter and
parse the string in the stored proc.  The attached example does this to
construct an IN clause.  This approach could be modified to populate a
temporary table that is created in the stored proc.

[uuencoded attachment deliminclause.sql omitted]
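
Since the attachment isn't reproduced here, a rough sketch of that kind
of parser, based only on the description above (the names are invented,
mytable/pk_val are borrowed from Jason's example, and the original
deliminclause.sql may well differ):

/* Scaffold #keys so the proc compiles (same trick as Jason's example). */
create table #keys (pk_val numeric(15))
go

/* Walk a comma-separated list with charindex()/substring() and insert
   each value into #keys.  Assumes a well-formed, non-empty list
   like "1,2,3". */
create procedure parse_key_list
    @key_list varchar(255)
as
    declare @pos int, @item varchar(30)

    while char_length(@key_list) > 0
    begin
        select @pos = charindex(',', @key_list)

        if @pos = 0
        begin
            select @item = @key_list
            select @key_list = null   /* done (ASE treats '' as a space) */
        end
        else
        begin
            select @item = substring(@key_list, 1, @pos - 1)
            select @key_list = substring(@key_list, @pos + 1, 255)
        end

        insert #keys (pk_val) values (convert(numeric(15), @item))
    end
    return
go

drop table #keys
go

/* Usage: create #keys, parse the list into it, then join against it. */
create table #keys (pk_val numeric(15))
go
exec parse_key_list '10,20,30'
select a.* from mytable a, #keys k where a.pk_val = k.pk_val
go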
