Keywords that I don't understand
-
Josh Gray wrote:
Also this[^] discussion is relevant.
It's also confusing bad macro design with bad design in general. Apples to oranges.
Josh Gray wrote:
but if you think you know better than him or me macro away till the cows come home, I really dont care
Tell me then: since he designed C++, why didn't he drop macro support if he loathed their mere existence that much? Compatibility can't be the reason, as it was a brand-new language at the time and wouldn't affect C. I can understand making it compile C code completely, but why not specify that compilers replace macros with something else under the hood? I mean, why keep them if they're so bad?
Jeremy Falcon "It's a good thing to do and a tasty way to do it." - Wilford Brimley[^]
Jeremy Falcon wrote:
Tell me then: since he designed C++, why didn't he drop macro support if he loathed their mere existence that much? Compatibility can't be the reason, as it was a brand-new language at the time and wouldn't affect C. I can understand making it compile C code completely, but why not specify that compilers replace macros with something else under the hood? I mean, why keep them if they're so bad?
Obviously I could only speculate about this, and that would be pointless. It is my preference to avoid macros whenever possible.
-
Stephen Hewitt wrote:
Perhaps you can explain the advantages of macros over inline functions?
Perhaps you can read my posts and find the answer there already, seeing as I said it more than once.
Stephen Hewitt wrote:
- Type safety.
Addressed twice. The type safety you refer to isn't a real issue, since a macro expands to type-safe code anyway.
Stephen Hewitt wrote:
- Automatically disabled in debug builds to aid debugging.
And like typing #ifdef _DEBUG will break your arm.
Stephen Hewitt wrote:
- Can use multiline constructs without having to end each line with a "\".
You know you're reaching deep when you have syntax as reasoning. :laugh:
Stephen Hewitt wrote:
- Can be put into namespaces!
- Can be members of classes and structs.
- Can be overloaded.
Actually, these are very good points, but they don't really mean inline functions should always be used in place of macros, as you suggest. Besides, the logic is contradictory: why write an inline function, if not usually for speed? Relying on the last two of the three points you quoted would effectively defeat that anyway.
Jeremy Falcon "It's a good thing to do and a tasty way to do it." - Wilford Brimley[^]
No one says macros should never be used, just that they shouldn't be used when other superior constructs exist for the same task; the specific example in this case is that macros should not be used as inline functions. I have consistently argued the point, complete with examples and data (compiler output) to support my claims, while you seem intent on arguing the person and accusing people of missing the point. Well, I've had my say and I'll let my comments speak for themselves.
Steve
-
Stephen Hewitt wrote:
No one says macros should never be used, just that they shouldn't be used when other superior constructs exist for the same task; the specific example in this case is that macros should not be used as inline functions.
I'm sorry you don't understand context - my bad. I'll have to remember that next time I write something.
Stephen Hewitt wrote:
I have constantly argued the point, complete with examples and data (compiler output) to support my claims;
And that makes you correct how?
Stephen Hewitt wrote:
while you seem intent on arguing the person and accusing people of missing the point. Well I've had my say and I'll let my comments speak for themselves.
Fair enough, but don't accuse me of what you did, and that is to argue a point I never spoke against. Have a nice day.
Jeremy Falcon "It's a good thing to do and a tasty way to do it." - Wilford Brimley[^]
-
Stephen Hewitt wrote:
No one says macros should never be used, just that they shouldn't be used when other superior constructs exist for the same task; the specific example in this case is that macros should not be used as inline functions.
Actually, I just reread what I said. I did mention inline functions. Perhaps you should bother reading my posts... oh wait.
Jeremy Falcon "It's a good thing to do and a tasty way to do it." - Wilford Brimley[^]
-
Josh Gray wrote:
Obviously I could only speculate at this and that would be pointless. It is my preference to avoid macros whenever possible.
Well, let's make a deal. I won't hold it against you if you don't hold it against me for sneaking the occasional macro or two in some code when you're not looking. :-D
Jeremy Falcon "It's a good thing to do and a tasty way to do it." - Wilford Brimley[^]
-
Christian Graus wrote:
inline is a *suggestion* to the compiler that this function is so simple
I hate to use your post, but that's a great reason I forgot. There is no real guarantee inline will even work. You're guaranteed to have inline expansion with macros though. Man, old age is getting to me. :-O
Jeremy Falcon "It's a good thing to do and a tasty way to do it." - Wilford Brimley[^]
The only times I've seen MSVC ignore an inline directive is when it's had no choice, such as with recursive functions; in this context this is a feature.
Steve
-
Stephen Hewitt wrote:
The only times I've seen MSVC ignore an inline directive is when it's had no choice, such as with recursive functions; in this context this is a feature.
And MSVC is the only compiler in existence too.
Jeremy Falcon "It's a good thing to do and a tasty way to do it." - Wilford Brimley[^]
-
No, it isn't. Until recently it was one of the worst (MSVC 6) in common use. I have inspected the machine code generated by MSVC 6 (I do a lot of postmortem debugging at work), and the inlining works as expected except when that's not possible, as I mentioned before. The Microsoft compilers after MSVC 6 produce even better code from what I've seen. All modern C++ compilers support inlining just fine.
Steve
-
Stephen Hewitt wrote:
Until recently it was one of the worst (MSVC 6) in common use. I have inspected the machine code generated by MSVC 6 (I do a lot of postmortem debugging at work) and the inlining works as expected except when that’s not possible, as I mentioned before. The Microsoft compilers after MSVC 6 produce even better code from what I’ve seen.
And you just felt the need to say this, right? I don't really see what this has to do with the point that was quoted - which, as I'm sure you guessed, was sarcasm.
Stephen Hewitt wrote:
All modern C++ compiler support inlining just fine.
Nice blanket statement there. Notice the extra caveats in these links, which have nothing to do with recursion: http://gcc.gnu.org/onlinedocs/gcc/Inline.html[^] http://www.osc.edu/hpc/manuals/ia64/docs2/c_ug_lnx.pdf[^] I don't know about you, but it seems easier to me to just use a macro than to learn the inlining quirks of every compiler in the world.
Jeremy Falcon "It's a good thing to do and a tasty way to do it." - Wilford Brimley[^]