Access csv dao delimited file import schema.ini

Loading CSV files with variable columns

In certain specialised Access applications there will be features allowing users to specify, say, a custom hierarchy. Mainly these are used for reporting so that amounts can be rolled up. Typically they are synchronised with the source data, which comes from an external source, and this can mean that the import/export specs in the Access front end need to change.

What if you wanted to avoid that? How would you import a CSV file with the new data layout into the database without having to change an import spec or a line of code?

The trick is, of course, to use the meta-data (data about data) to control and validate the input. And using the ability of DAO to open a recordset on a file along with schema data this can be done fairly easily. Here’s one way that I did it.

DAO has the ability to open a delimited file without using an import spec if there is a file named schema.ini in the same folder as the file being imported. We wouldn’t really want our users to manipulate this file by hand, so we’ll need to create it using code.

The file looks like this:

Format=CSVDelimited | TabDelimited | FixedLength
Col<n>=<ColName> <datatype>

A sample file looks like so:

[test.csv]
Format=CSVDelimited
ColNameHeader=True
Col1=LName Char Width 20
Col2=FName Char Width 20
Col3=DOB Date
Col4=Children Integer

Detailed information on schema.ini file contents can be found on MSDN.

Sample code to create this file:

Dim oFS As New FileSystemObject
Dim oTextStream As TextStream
Dim nColNum As Long

Set oTextStream = oFS.OpenTextFile("c:\test\schema.ini", ForWriting, True)
oTextStream.WriteLine "[test.csv]"
oTextStream.WriteLine "Format=CSVDelimited"
oTextStream.WriteLine "ColNameHeader=True"

Dim oRS As DAO.Recordset
'need to read our meta-data to get the names of the columns and datatypes
Set oRS = CurrentDb.OpenRecordset("select ent_name, ent_type from entities")

nColNum = 1
While Not oRS.EOF
    'text columns also need a width, so append one for char columns
    oTextStream.WriteLine "Col" & nColNum & "=" & oRS.Fields("ent_name") & " " & _
        oRS.Fields("ent_type") & IIf(oRS.Fields("ent_type") = "char", " width 50", "")
    nColNum = nColNum + 1
    oRS.MoveNext
Wend

oRS.Close
oTextStream.Close

You obviously wouldn’t use hard-coded names and paths like these, but I was in a rush writing this. You’d also probably want to store more information in the meta-data, such as text widths.

This can then be imported using the DoCmd.TransferText command. However, if you need to run through the file like a recordset then you can use DAO to open a recordset on the file. The SQL to open the file with DAO looks like this:
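For the straight import, a minimal sketch might look like this (the target table name tblImport is my own invention for illustration):

```vba
'Import test.csv into tblImport. With schema.ini sitting in the
'same folder as the file, no saved import spec is needed; the
'final True says the file has a header row.
DoCmd.TransferText acImportDelim, , "tblImport", "c:\test\test.csv", True
```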

select <FIELDS> from [Text;FMT=Delimited;HDR=YES;DATABASE=<FILEPATH>;].[<FILE>#<EXT>];

<FIELDS> is typically set to *
<FILEPATH> is the folder the file resides in
<FILE> is the filename without extension, <EXT> being the extension.

To open such a recordset all you then do is

dim oRS as dao.recordset
set oRS = currentdb.OpenRecordset("select * from [Text;FMT=Delimited;HDR=YES;DATABASE=c:\test;].[test#csv]")

And then you can process the data in the file. I used this technique to great success when processing variable data that would end up in a master-detail relationship. The first few columns were known and fixed, the remainder described in meta-data. These remaining columns were then pivoted at load time and the data merged into the existing data in the database table.
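As a rough sketch of that pivot step, assuming the file recordset is open as oRS with the key in field 0 (the detail table tblDetail and its columns here are hypothetical):

```vba
'Unpivot the variable columns: walk every field after the fixed
'key column and write one detail row per column per record.
Dim oDB As DAO.Database
Dim i As Integer

Set oDB = CurrentDb
While Not oRS.EOF
    For i = 1 To oRS.Fields.Count - 1
        oDB.Execute "insert into tblDetail (key_id, col_name, col_value) " & _
            "values (" & oRS.Fields(0) & ", '" & oRS.Fields(i).Name & "', '" & _
            oRS.Fields(i) & "')", dbFailOnError
    Next i
    oRS.MoveNext
Wend
```

In a real load you would merge these rows into the existing detail data rather than blindly inserting, and parameterise the values instead of concatenating them.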

I hope you can all make use of this feature.

consumption electricity green power virtualisation

Why server virtualisation is important

The other day our electricity bill arrived and, well, it was quite large. It quite shocked us, and it showed that we were using a lot more power than last year. I began to think about what had changed to increase our power consumption so much over the last year.

In the past year I had started leaving the servers, all five of them, on 24/7. I am hypothesising that that is the difference. I don’t have an energy meter on each server, but some have four fans and five hard disks. In theory they should spin down over time, but whenever I’ve checked it hasn’t happened. And with a small office the servers are only utilised 25% of the time, at most.

So now I have begun a crusade to reduce our power consumption. I’ve already shut down the Asterisk server and have all phones going direct to the VOIP provider. I have shut down the Windows server, which hadn’t been used in two months. I moved services from the backup server to the NAS server, and configured the backup server to turn off at 8am every day – I just have to remember to turn it on before I go.

Now I need to move services from my webmail server to the NAS machine. That will make one machine responsible for DNS, DHCP, LDAP, Samba, e-mail, IMAP, WebDAV and SquirrelMail.

When I need a server turned on, to do Oracle or Umbraco or SQL Server testing, I’ll just turn it on when needed. I don’t know what possessed me to build so many servers… no real use, is there? This is why virtualisation is so important. Under-utilised machines become well utilised, and fewer servers means less power.

Glad I thought of it! (lol) Just being cheeky.

ado appendchunk images MS Access

Getting images and icons into an Access column

Last post I showed how to extract icons out of a table column. Storing data in this way saves a lot of room compared to storing an OLE object wrapping the image. But how do you get the image into the column in the first place?

That’s what I’m gonna show today.

First make sure you have Eduardo Morcillo’s essential olelib.tlb file. His site contains many useful tools and code. You might also find his ImageLoad with GDI+ sample useful.

First to load an image from the file you’ll need to do something like this:

Dim oPicture As New StdPicture

Set oPicture = LoadPicture(sfilePath) 'the path to the image file here (.ico, .bmp)

Now you have the image loaded you need to convert it into a stream of bytes to put into your column.

Dim bytedata() As Byte

Picture2Array oPicture, bytedata

Where Picture2Array is defined as:


Private Declare Function GlobalAlloc Lib "kernel32" (ByVal wFlags As Long, ByVal dwBytes As Long) As Long
Private Declare Function GlobalSize Lib "kernel32" (ByVal hMem As Long) As Long
Private Declare Function GlobalLock Lib "kernel32" (ByVal hMem As Long) As Long
Private Declare Function GlobalUnlock Lib "kernel32" (ByVal hMem As Long) As Long
Private Declare Sub CopyMemory Lib "kernel32" Alias "RtlMoveMemory" ( _
    lpvDest As Any, lpvSource As Any, ByVal cbCopy As Long)

'Magic number ("lt") at the start of the saved picture stream
Const PictureID = &H746C&

Private Type PictureHeader
    Magic As Long
    Size As Long
End Type

Public Sub Picture2Array(ByVal oObj As StdPicture, aBytes() As Byte)
    Dim oIPS As IPersistStream
    Dim oStream As IStream
    Dim hGlobal As Long
    Dim lPtr As Long
    Dim lSize As Long
    Dim Hdr As PictureHeader

    Set oIPS = oObj                              ' Get the IPersistStream interface
    Set oStream = CreateStreamOnHGlobal(0, True) ' Create an IStream object on global memory
    oIPS.Save oStream, True                      ' Save the picture in the stream
    hGlobal = GetHGlobalFromStream(oStream)      ' Get the global memory handle from the stream
    lSize = GlobalSize(hGlobal)                  ' Get the memory size
    lPtr = GlobalLock(hGlobal)                   ' Get a pointer to the memory
    If lPtr Then
        lSize = lSize - Len(Hdr)                 ' Skip the picture header
        ReDim aBytes(0 To lSize - 1)
        CopyMemory aBytes(0), ByVal lPtr + Len(Hdr), lSize ' Copy the data to the array
    End If
    GlobalUnlock hGlobal                         ' Release the pointer
    Set oStream = Nothing                        ' Release the IStream object
End Sub

That’s got the image in memory in a byte array. Now we need to put this into an OLE Object column in the database. Note that we are NOT putting an OLE object into that column, but Access won’t care, as long as we don’t put it on a form and try to access it that way.

The only way to put the byte array into the column is via the ADO recordset’s AppendChunk method. Because I am dealing with small images here, I just put it in with a single AppendChunk call. If you’re dealing with larger images, say anything over 64K, you may want to break it up into 64K chunks. See How To Read and Write BLOBs Using GetChunk and AppendChunk at the Microsoft support site for more information on that.
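A chunked write along the lines of that article might be sketched like this, assuming bytedata holds the image and oadoRS is the open recordset from below (the chunk size and variable names are my own):

```vba
Const CHUNK_SIZE As Long = 65536    '64K per AppendChunk call

Dim lOffset As Long
Dim lBytesLeft As Long
Dim aChunk() As Byte
Dim i As Long

lBytesLeft = UBound(bytedata) - LBound(bytedata) + 1
lOffset = LBound(bytedata)
Do While lBytesLeft > 0
    'size the buffer for a full chunk, or whatever remains
    If lBytesLeft < CHUNK_SIZE Then
        ReDim aChunk(0 To lBytesLeft - 1)
    Else
        ReDim aChunk(0 To CHUNK_SIZE - 1)
    End If
    For i = 0 To UBound(aChunk)
        aChunk(i) = bytedata(lOffset + i)
    Next i
    oadoRS(1).AppendChunk aChunk    'each call appends to the column
    lOffset = lOffset + UBound(aChunk) + 1
    lBytesLeft = lBytesLeft - (UBound(aChunk) + 1)
Loop
```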

For me it was a simple case of:

Dim oadoRS As New ADODB.Recordset

'open updatable: the default forward-only, read-only recordset
'would not let us add a row
oadoRS.Open "select * from ImagesTable", CurrentProject.Connection, _
    adOpenKeyset, adLockOptimistic
oadoRS.AddNew
oadoRS(0) = 100 'a sample image ID
oadoRS(1).AppendChunk bytedata
oadoRS.Update

There you have it, very simple. This proves much easier to handle than having to supply the icon and image files and install them on the user’s machine. And it’s much faster!