.net 2.0 - What Determines the Encoding used in Xml Serialization?


I am currently serializing an object using XmlSerializer, and the resulting XML starts with:

  <?xml version="1.0" encoding="utf-16"?>
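
The serialization code itself isn't shown; presumably it is something along these lines, where Order is just a hypothetical stand-in for the real type:

  using System.IO;
  using System.Xml.Serialization;

  public class Order
  {
      public int Id;
  }

  public static class SerializationDemo
  {
      public static string SerializeToString(Order order)
      {
          // Serializing to a StringWriter produces a .NET string,
          // which is why the declaration reports UTF-16.
          XmlSerializer serializer = new XmlSerializer(typeof(Order));
          using (StringWriter writer = new StringWriter())
          {
              serializer.Serialize(writer, order);
              return writer.ToString();
          }
      }
  }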

How do I get rid of that declaration? In this particular case I don't need it (I will only use the serialized string to deserialize it later with my own code), and I'm also trying to make this as fast as possible, since we do tons of these serializations.

So the question is: can I trust that this signature will always be there? (Could I simply strip the first 39 characters off the resulting string, and then prepend that exact string again before deserializing?)

Or could the encoding ever be different, for example?

Thanks

The answer to your question is in the code you haven't shown us - how you did the serialization. You are presumably serializing to a StringWriter or a StringBuilder. .NET strings are UTF-16, so if you serialize to a string, you have no choice but to get UTF-16 encoding.
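
To illustrate: a plain StringWriter always reports UTF-16. If you want the declaration to advertise a different encoding while still producing a string, a common workaround (a sketch, not something from the question) is to subclass StringWriter and override its Encoding property:

  using System.IO;
  using System.Text;

  // A StringWriter that reports UTF-8; the XML declaration written through it
  // will then say encoding="utf-8", even though the backing store is a string.
  public class Utf8StringWriter : StringWriter
  {
      public override Encoding Encoding
      {
          get { return Encoding.UTF8; }
      }
  }

Passing an instance of this writer to XmlSerializer.Serialize in place of the plain StringWriter changes only what the declaration claims; the in-memory string is still UTF-16 until it is actually written out as bytes.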

In other situations, the encoding is determined by the destination: if you serialize to a TextWriter of some kind, that TextWriter's encoding will be used unless it is overridden. If you serialize to an XmlWriter, the XmlWriterSettings encoding will be used.
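
And if the real goal is simply to drop the declaration rather than strip characters off the string, a cleaner route (again a sketch, reusing the hypothetical Order type from above) is to serialize through an XmlWriter configured with XmlWriterSettings:

  using System.Text;
  using System.Xml;
  using System.Xml.Serialization;

  public static class DeclarationFreeDemo
  {
      public static string SerializeWithoutDeclaration(Order order)
      {
          XmlWriterSettings settings = new XmlWriterSettings();
          settings.OmitXmlDeclaration = true;  // no <?xml ... ?> at all

          XmlSerializer serializer = new XmlSerializer(typeof(Order));
          StringBuilder output = new StringBuilder();
          using (XmlWriter xmlWriter = XmlWriter.Create(output, settings))
          {
              serializer.Serialize(xmlWriter, order);
          }
          return output.ToString();
      }
  }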

I suggest you leave the declaration alone unless you are an expert in XML. The .NET XML APIs understand the rules of XML; unless you understand them just as well, I recommend you leave this to the experts.

