Dealing with Memo fields that contain binary data

Jul 24, 2012 at 4:28 PM

I've got a table that stores files in a memo field. I need to retrieve the file and dump it into a byte[]. Unfortunately, I'm not able to retrieve the entire file (in one test I only get around 600 bytes of a roughly 40K file). I'm assuming it's getting truncated because of the encoding, but I'm not sure if there are any workarounds.

For the EF provider, I'm basically doing something like this:

var document = Context.ExecuteStoreQuery<SimpleDocument>(sql, key).FirstOrDefault();
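
For context, the SQL is just a plain parameterized select of the memo column, and SimpleDocument is a flat projection class, roughly like this (a sketch rather than the exact code; the table, column, and property are all named Document in my schema):

public class SimpleDocument
{
    // Memo column; the provider materializes it as a string
    public string Document { get; set; }
}

string sql = @"SELECT Document
                 FROM Document
                WHERE id = ?";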

I thought I might be able to work around it by doing a STRCONV() on the column. Unfortunately that just results in a "string is too long to fit" exception when ExecuteStoreQuery is run.

Any ideas on how to make this work? 

Jul 24, 2012 at 4:53 PM

OK, this is a bit weird, but I got it working. I realized the "string is too long" error was because I wasn't casting the result back to a memo. I first tried converting the data to UTF-8, but that still clipped the text. Instead, I now convert it to a Base64-encoded string and convert it back on the .NET side of things.

Essentially my query now looks like this:

string sql = @"SELECT CAST(STRCONV(Document, 13) AS M) AS Document
                 FROM Document
                WHERE id = ?";

On the .NET side of things I just convert it back:

byte[] bytes = Convert.FromBase64String(doc.Document);
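
Put together, the retrieval looks roughly like this (the method wrapper, key type, and null check are just for illustration):

public byte[] LoadDocument(int key)
{
    // STRCONV(..., 13) Base64-encodes the memo contents inside VFP, and
    // CAST(... AS M) keeps the result typed as a memo so it doesn't hit
    // the "string is too long to fit" error again.
    string sql = @"SELECT CAST(STRCONV(Document, 13) AS M) AS Document
                     FROM Document
                    WHERE id = ?";

    var doc = Context.ExecuteStoreQuery<SimpleDocument>(sql, key).FirstOrDefault();
    if (doc == null)
        return null;

    // Undo the Base64 encoding to get the original file bytes back.
    return Convert.FromBase64String(doc.Document);
}

The Base64 text is about a third larger than the raw data, but it avoids pushing raw binary through a string conversion.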

Kind of a roundabout way of doing this, but so far it seems to work.

If there is a better way to handle this I'd love to hear it.

Jul 24, 2012 at 7:40 PM

(sigh) Well, it works for some files, but it still fails for others. So much for that idea.

Coordinator
Jul 24, 2012 at 7:47 PM

Could you send me an example project so that I can debug the issue?

Jul 24, 2012 at 8:19 PM

I think I figured it out. The codepage on this other file is different, which was causing the problems. Once I adjusted the codepage, it started working again.