Tech Off Thread

4 posts

Value Type/Reference Type Question

  • Frankie Fresh

    Since I learned that .NET doesn't have primitive types (I'm a reformed Java programmer) and that all data types inherit from Object, this one question has puzzled me.

    What determines whether a given data type is a value type or a reference type?

    The answer I got from a book was along the lines of "if you use the new keyword then it's a reference type." I didn't find that to be a satisfying answer.

    Some colleagues and I have been discussing it, and we think that types without constructor methods become value types and those with constructors become reference types.

    Are we on point? Or completely off the mark?

  • LazyCoder

    Completely off the mark. Basically, what you would think of as primitives are value types (as are structs and enums). All others are reference types.
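The difference shows up most clearly in assignment semantics — a minimal sketch (the `PointValue`/`PointReference` types are made up for illustration):

```csharp
using System;

// A struct is a value type: variables hold the data itself.
struct PointValue
{
    public int X;
}

// A class is a reference type: variables hold a reference to the object.
class PointReference
{
    public int X;
}

class Program
{
    static void Main()
    {
        // Value type: assignment copies the whole value.
        PointValue a = new PointValue { X = 1 };
        PointValue b = a;   // b is an independent copy
        b.X = 99;
        Console.WriteLine(a.X); // prints 1 — a is unaffected

        // Reference type: assignment copies the reference only.
        PointReference c = new PointReference { X = 1 };
        PointReference d = c;   // d refers to the same object as c
        d.X = 99;
        Console.WriteLine(c.X); // prints 99 — same object
    }
}
```

Note that `new` appears in both cases, which is why "uses `new` means reference type" is a misleading rule of thumb.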

    Check this page out — it explains value types in the CLR.

  • bitmask

    There are a couple of weird concepts in this area Smiley

    Value types must derive from System.ValueType (enums derive from System.Enum, which itself derives from ValueType). The interesting thing is that ValueType and Enum are themselves reference types.

    Also, all value types come with a default (parameterless) constructor to initialize a value type to "all bits zero":

    int i = new int(); // legal C#
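The same "all bits zero" initialization applies to user-defined structs — a minimal sketch (the `Pair` type is made up for illustration):

```csharp
using System;

// Hypothetical struct to show default zero-initialization.
struct Pair
{
    public int A;
    public double B;
}

class Program
{
    static void Main()
    {
        int i = new int();   // same as int i = 0;
        Pair p = new Pair(); // every field is zeroed

        Console.WriteLine(i);   // prints 0
        Console.WriteLine(p.A); // prints 0
        Console.WriteLine(p.B); // prints 0
    }
}
```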

    And if you declare your own ValueType, you cannot declare a default constructor:

    struct foo
    {
       public foo() // error!
       {
       }
    }
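You can, however, declare a constructor that takes parameters, as long as it assigns every field — a minimal sketch (the `foo`/`bar` names just mirror the example above):

```csharp
using System;

struct foo
{
    public int bar;

    // Parameterized constructors are allowed in a struct;
    // the compiler requires that all fields be assigned.
    public foo(int bar)
    {
        this.bar = bar;
    }
}

class Program
{
    static void Main()
    {
        foo f = new foo(42);      // uses the declared constructor
        foo g = new foo();        // still legal: the built-in zeroing constructor
        Console.WriteLine(f.bar); // prints 42
        Console.WriteLine(g.bar); // prints 0
    }
}
```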

  • Frankie Fresh

    Ok, that clears things up a bit.

    Thanks!
