Archive for September, 2009

Deep Copy Array/Collection/etc. using serialization

A useful snippet of code I used in a web services project a while ago, which I just stumbled back across recently. It allows you to make a deep copy of an array where you would normally get a shallow copy, which stops you from manipulating it independently of the source. This gives you a completely fresh and separate copy: the original is serialised to a memory stream, then a new instance is created from the serialised representation (so obviously, if it's an array of custom objects, those objects will need to be marked serializable).

OrderItem[] order_items_clone;
using (MemoryStream ms = new MemoryStream())
{
	// serialise the source array into the memory stream
	System.Runtime.Serialization.Formatters.Binary.BinaryFormatter formatter = new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter();
	formatter.Serialize(ms, order_items);
	// rewind, then deserialise a completely independent copy
	ms.Seek(0, SeekOrigin.Begin);
	order_items_clone = (OrderItem[])formatter.Deserialize(ms);
}
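If you find yourself doing this in more than one place, the same idea wraps up nicely in a generic helper. Here's a rough sketch of what that might look like (the DeepCopy<T> name and shape are my own, not lifted from the project; T and everything it holds still needs to be serializable):

using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public static T DeepCopy<T>(T source)
{
	using (MemoryStream ms = new MemoryStream())
	{
		BinaryFormatter formatter = new BinaryFormatter();
		// serialise the whole object graph into memory...
		formatter.Serialize(ms, source);
		ms.Seek(0, SeekOrigin.Begin);
		// ...then read it back as a completely independent copy
		return (T)formatter.Deserialize(ms);
	}
}

With that in place the snippet above just becomes order_items_clone = DeepCopy(order_items);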



Scott Klueppel’s Blog – Error Log RSS Feed

Sam pointed me to an article showing a really nice way to deal with error logging: as opposed to filling a debug mailbox with hundreds of mails that will soon become unmanageably large, have the errors logged to a database, then published via RSS!

Scott Klueppel’s Blog – Error Log RSS Feed.



Synchronous Yield messing up my foreach loop!

I recently got stung while debugging a colleague's code by my lack of understanding of the yield keyword! We had a foreach loop which looked like this:

foreach (PurchaseOrderLine aLine in PurchaseOrderLine.LoadExtractLines(extractTime, systemId)) {
  //do some processing stuff...
  PurchaseOrder.MarkAsExported(lastId, extractTime);
}

LoadExtractLines returns an IEnumerable<PurchaseOrderLine> and uses yield to return each line. The stored proc it runs is quite intensive. The whole thing looks like this:

public static IEnumerable<PurchaseOrderLine> LoadExtractLines(DateTime cutOff, int systemId)
{
    using (SqlConnection conn = new SqlConnection(SingleAccess.Instance.ConnectionToUse))
    {
        conn.Open();
        using (SqlCommand cmd = new SqlCommand("Get_PurchaseOrderLinesToExtract", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@CutOff", cutOff.ToString("dd MMM yyyy HH:mm:ss"));
            cmd.Parameters.AddWithValue("@SystemId", systemId);
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    yield return LoadLine(reader);
                }
            }
        }
    }
}

MarkAsExported runs quite an intensive update procedure on the database. As more and more data came into the system we started to see SQL timeouts, and upon running a trace noticed something strange. Logging the RPC:Starting and RPC:Completed events of the stored procs, the Get_PurchaseOrderLinesToExtract proc which feeds the foreach loop was starting, then the update was starting before the Get_ had finished; the two were running side by side, causing the timeouts!
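If you've not seen this behaviour before, it's easy to reproduce with a toy iterator. This little console sketch (nothing to do with the real project code) prints the "producing" and "consuming" lines interleaved, rather than all the producing happening up front:

using System;
using System.Collections.Generic;

class YieldDemo
{
    static IEnumerable<int> Produce()
    {
        for (int i = 1; i <= 3; i++)
        {
            Console.WriteLine("producing " + i); // runs lazily, one item per MoveNext
            yield return i;
        }
    }

    static void Main()
    {
        foreach (int i in Produce())
        {
            Console.WriteLine("consuming " + i); // interleaves with the producer
        }
    }
}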

It turns out the foreach loop started the moment it received its first row, yielded back from the LoadExtractLines method, which in retrospect does make sense! The solution was to convert the Load method to populate a local List<> and then return the whole thing once complete, removing the yield statement altogether and forcing the process to wait for the entire result set before starting the loop.
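For what it's worth, the reworked method looked roughly like this (a sketch of that change rather than the exact code from the project): the reader is drained into a List<PurchaseOrderLine> and only returned once complete, so the stored proc has finished before the foreach (and MarkAsExported) ever runs.

public static IEnumerable<PurchaseOrderLine> LoadExtractLines(DateTime cutOff, int systemId)
{
    List<PurchaseOrderLine> lines = new List<PurchaseOrderLine>();
    using (SqlConnection conn = new SqlConnection(SingleAccess.Instance.ConnectionToUse))
    {
        conn.Open();
        using (SqlCommand cmd = new SqlCommand("Get_PurchaseOrderLinesToExtract", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@CutOff", cutOff.ToString("dd MMM yyyy HH:mm:ss"));
            cmd.Parameters.AddWithValue("@SystemId", systemId);
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    lines.Add(LoadLine(reader)); // buffer every row up front instead of yielding
                }
            }
        }
    }
    return lines; // the foreach now enumerates an in-memory list
}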

