
Why GetRows is best to fetch data

GetRows! ... Don't retrieve data any other way!

Why LOOP and Movenext? They just slow things down...

Most people write database retrieval code like this:

<%
' Open database and recordset
Do Until rs.EOF
    city=rs("city")
    st=rs("state")
    zip=rs("zip")
    ' ... process and format the data ...
    rs.MoveNext
Loop
' Close recordset and database
%>

Examples of such code are at:
http://www.learnasp.com/learn/dbtable.asp
and 
http://www.learnasp.com/learn/dbsimple.asp

If there are 700 records and 3 columns, for example, we have 3,500 database read requests over the wire:

+2,100 ..... each field read is a request (700 rows x 3 fields)
+  700 ..... each .MoveNext is +1 request
+  700 ..... each .EOF test is +1 request
======
 3,500 requests

Lots of round-trips. Actually I am fudging a little here... the Recordset object maintains buffers in chunk sizes defined by rstemp.CacheSize -- setting it determines how many rows of data are buffered when one record is requested. A CacheSize of 50, for example, would reduce the previous example's trips to the backend database to 14, since when you ask for the first record you get 50, and the next trip doesn't happen until a MoveNext triggers records 51-100 being retrieved. But those buffers, and their attendant structures, aren't free either.
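
For illustration, here is a minimal sketch of setting CacheSize (the connection and SQL names match the sample further down this page; CacheSize is set before the recordset is opened, which means creating the recordset explicitly rather than via Connection.Execute):

<%
' Sketch: buffer 50 rows per round-trip instead of the default 1
set rstemp=server.createobject("adodb.recordset")
rstemp.CacheSize=50            ' each fetch from the backend returns 50 rows
rstemp.open mySQL, conntemp    ' mySQL and conntemp opened as usual
%>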

How about 1 transfer instead of 3,500?
A better approach is GetRows!

dim myarray
myarray=rstemp.GetRows

One transfer for 700 records x 3 fields, for example. We can even close the Recordset and Connection BEFORE formatting, calculating, and massaging our data.

Imagine: the backend makes one transfer into a 2,100-element array and hands the array back.

Very efficient.
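
A quick sketch of the pattern (the rstemp and conntemp names match the complete working sample below):

<%
' Sketch: one transfer, then the database is released before any formatting
myarray=rstemp.GetRows()       ' 2-D array: (field, row)
rstemp.close : conntemp.close  ' connection back in the pool already
numfields=ubound(myarray,1)+1  ' e.g. 3
numrows=ubound(myarray,2)+1    ' e.g. 700
' ... format and massage myarray at leisure ...
%>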

Here is a real-world sample:


<%@ EnableSessionState=False %>
<%response.buffer=true%>
<html><head>
<TITLE>dbtablegetrows.asp</TITLE>
</head>
<body bgcolor="#FFFFFF">
<%
' displays a database in table form via GetRows
myDSN="DSN=Student;uid=student;pwd=magic"
mySQL="select * from publishers where state='NY'"
showblank="&nbsp;"
shownull="-null-"

set conntemp=server.createobject("adodb.connection")
conntemp.open myDSN
set rstemp=conntemp.execute(mySQL)
If  rstemp.eof then
   response.write "No records matched<br>"
   response.write mySQL & "<br>So cannot make table..."
   Call CloseAll
   response.end
end if

response.write "<table border=""1""><tr>" & vbcrlf
'Put Headings On The Table of Field Names
for each whatever in rstemp.fields
   response.write "<td><b>" & whatever.name & "</b></td>" & vbcrlf
next
response.write "</tr>" & vbcrlf

' Now let's grab all the records
alldata=rstemp.getrows
Call CloseAll

if  IsArray(alldata) then
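   ' GetRows returned a 2-D array: dimension 1 = fields (columns), dimension 2 = rows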
   numcols=ubound(alldata,1)
   numrows=ubound(alldata,2)
   FOR rowcounter= 0 TO numrows
      response.write "<tr>" & vbcrlf
      FOR colcounter=0 to numcols
         thisfield=alldata(colcounter,rowcounter)
         if isnull(thisfield) then
            thisfield=shownull
         end if
         if trim(thisfield)="" then
            thisfield=showblank
         end if
         response.write "<td valign=top>"
         response.write thisfield
         response.write "</td>" & vbcrlf
      NEXT
      response.write "</tr>" & vbcrlf
   NEXT
   response.write "</table>"
End If
%>

</body></html>
<%
SUB CloseAll
   rstemp.close
   set rstemp=nothing
   conntemp.close
   set conntemp=nothing
END SUB
%>


Read:
http://www.learnasp.com/advice/roundrobin.asp
to get a full sense of the importance of closing resources early.

Get Some Real Code to Use!

If your data formatting is simple, GetString, shown at:
http://www.learnasp.com/learn/dbtablegetstring.asp
can deliver even faster results.
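
The idea, roughly (a minimal sketch, assuming rstemp is an open recordset; 2 is adClipString, -1 means all rows, and the delimiter and null strings are just illustrative):

<%
' Sketch: one call pulls every row into a single delimited string
' GetString(format, numrows, columndelimiter, rowdelimiter, nullexpr)
bigstring = rstemp.GetString(2, -1, ", ", "<br>" & vbcrlf, "-null-")
response.write bigstring
%>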

Aren't numbered arrays harder for people to read?
Sure, so I have prepared an example where we give the numbered fields variable names and make the code easier for people to read (a small sketch follows below). See:
http://www.learnasp.com/learn/dbtablegetrowsnonum.asp
and
http://www.learnasp.com/learn/subgetrows.asp
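
The trick, roughly (a minimal sketch; the column names and positions are assumed to match the SELECT list, and alldata is the array GetRows handed back):

<%
' Sketch: give the numeric column positions readable names
const colCity = 0
const colState = 1
const colZip = 2
for rowcounter = 0 to ubound(alldata, 2)
   response.write alldata(colCity, rowcounter) & ", "
   response.write alldata(colState, rowcounter) & " "
   response.write alldata(colZip, rowcounter) & "<br>" & vbcrlf
next
%>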

Old Habits Die Hard... Do you want to be an old dog that won't learn new tricks?
The main reason people do MoveNext and field-at-a-time reads is that it is what they learned. But suppose 300 people hit my main page and it reads a database: if I do those reads without GetRows or GetString, I may need a much more powerful CPU as the scripts spin, round-robin, save their context, and store their data at a low level to serve all those users simultaneously. On a webserver, every millisecond you waste in one script may be multiplied by tens of thousands of users (Amazon.com, CNN.com), so it behooves traditional programmers to bite the bullet and throw away one-at-a-time field reads and MoveNexts.

Notice how my GetRows example can close the RS and CN before processing data. Say you instead LOOP and PROCESS/FORMAT inside the loop: if the reads take 1.5 seconds and the processing/formatting takes 2.5 seconds, the database is open for 4 seconds. With GetRows the same retrieval might take 3/4 of a second; the RS and CN are closed and back in the pool for others, the formatting is done while someone else uses the connection, and whatever READ locks were placed on the data rows/chunks aren't affecting others while the formatting occurs.

Kyle Dyer writes us... "What about JScript?"

And he is kind enough to supply us with the results of his research:

var strSQL = "SELECT times FROM ...";
recordSet.Open(strSQL, conn);
var bigString = recordSet.GetString(2, -1, "column-delim", "row-delim", "null");

parameters:
  1: string format: has to be 2, adClipString (cannot be omitted in JScript)
  2: number of rows: -1 gets you all of them (cannot be omitted in JScript)
  3: column delimiter
  4: row delimiter
  5: NullExpr (text substituted for null values)

Does it matter for small amounts of data?

YES!!!!!!!!

My site has SQL Server scripts that run like lightning. I once needed to fill a 9-item listbox from Access and got 90-second script timeouts with MoveNext; GetString never timed out. So in a real production situation it makes weak databases feasible, and of course it reduces the load on more industrial back-ends, so maybe the SQL Server doesn't need as many indexes or RAM upgrades.

Is there any faster way to retrieve data?
"The Worlds Fastest Listbox"
http://www.learnasp.com/learn/speedappdata.asp
shows the only faster way to display databases -- by using caching.
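
The shape of that approach, roughly (a minimal sketch; the Application variable name is illustrative, and a real version would also refresh the cache after a set duration):

<%
' Sketch: build the HTML once, park it in the Application object,
' and serve it from memory on every later request
if Application("listboxHTML") = "" then
   ' build listHTML from the database here via GetRows or GetString;
   ' a placeholder stands in for that work:
   listHTML = "<option>...database-built options...</option>"
   Application.Lock
   Application("listboxHTML") = listHTML
   Application.Unlock
end if
response.write Application("listboxHTML")
%>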

Finally, I hunkered down and encapsulated the speed of GetString, caches that can be set for durations, and the elegance of named fields and flexible formatting at:
http://www.learnasp.com/learn/rsfast.asp
I would argue it is the best code sample on our entire site.
