How do I always enable compiler warnings?
I often hear that, when compiling C and C++ programs, I should always "enable compiler warnings". Why is this necessary? How do I do that?
I also hear that I should "treat warnings as errors". Should I? How do I do that?
This post is intended as a FAQ and/or a generic duplicate target.
c++ c warnings
These questions and their answers are so canonical that they should be on the first page (in bold or otherwise emphasized) of any decent learning book! – engf-010
I suspect the people who tell you this have never spent an hour or more looking for the one actual error in pages of meaningless warnings. – jamesqf
@jamesqf The point is to fix the warnings, not to just have loads of them sitting around cluttering up the output. – David Z
@jamesqf If you have hundreds of thousands of lines of code that have been developed over years and never compiled with warnings enabled, then yes, turning on warnings at that point is either useless, or mandates a huge catch-up effort to fix them all (no matter what) in order to make them useful again. But that's almost completely beside the point of this question, which is to encourage people to enable warnings (and take them seriously) from the beginning, so that they don't get into horrible binds like that. – Steve Summit
@jamesqf And what you should do in that case is so-called "maintenance" of the code to keep it to current specs. ;) Obviously there are codebases where this isn't thought to be economical. – Nobody
12 Answers
Why enable warnings?
C and C++ compilers are notoriously bad at reporting some common programmer mistakes by default, such as:
- forgetting to initialise a variable
- forgetting to return a value from a function
- arguments to the printf and scanf families not matching the format string
- a function being used without being declared beforehand (C only)
These can be detected and reported, just usually not by default; this feature must be explicitly requested via compiler options.
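For illustration, here is a short sketch (a hypothetical file, warn.c) that a typical compiler accepts silently by default, while -Wall -Wextra flags the marked lines; exact diagnostics vary by compiler and version:

#include <stdio.h>

int f(void)
{
    /* forgot to return a value: typically -Wreturn-type */
}

int main(void)
{
    int x;                 /* never initialised */
    printf("%s\n", x);     /* %s does not match an int: typically -Wformat */
    printf("%d\n", f());
    return 0;
}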
How to enable warnings?
This depends on your compiler.
Microsoft C and C++ compilers understand switches like /W1, /W2, /W3, /W4 and /Wall. Use at least /W3. /W4 and /Wall may emit spurious warnings for system header files, but if your project compiles cleanly with one of these options, go for it. These options are mutually exclusive.
Most other compilers understand options like -Wall, -Wpedantic and -Wextra. -Wall is essential, and all the rest are recommended. These options can be used separately or all together.
Your IDE may have a way to enable these from the user interface.
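For example, from the command line (illustrative invocations; the flag sets shown are a common baseline, not the only reasonable choice):

# GCC or Clang
gcc -Wall -Wextra -Wpedantic main.c
clang -Wall -Wextra -Wpedantic main.c

# MSVC (from a developer command prompt)
cl /W4 main.cpp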
Why treat warnings as errors? They are just warnings!
A compiler warning signals a potentially serious problem in your code. The problems listed above are almost always fatal; others may or may not be, but you want compilation to fail even if it turns out to be a false alarm. Investigate each warning, find the root cause, and fix it. In the case of a false alarm, work around it — that is, use a different language feature or construct so that the warning is no longer triggered. If this proves to be very hard, disable that particular warning on a case by case basis.
You don't want to just leave warnings as warnings, even if all of them are false alarms. That could be OK for very small projects where the total number of warnings emitted is less than 7. Anything more, and it's easy for a new warning to get lost in a flood of old familiar ones. Don't allow that: just make your whole project compile cleanly.
How to treat warnings as errors?
This is again done with compiler switches. /WX is for Microsoft; most others use -Werror. In either case, the compilation will fail if there are any warnings produced.
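For example (illustrative; file names are placeholders):

gcc -Wall -Wextra -Werror main.c    # GCC/Clang: any warning now stops the build
cl /W4 /WX main.cpp                 # MSVC equivalent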
I have posted this Q&A because I'm sick and tired of telling people to enable warnings. Now I can just point them here (or, if I'm in a particularly evil mood, close their question as a dupe). You are welcome to improve this answer or add your own! – n.m.
You can also use clang's -Weverything. – pmg
The only modifier I would add is that some warnings may not be helpful for your application. (I've seen warnings that the compiler added 2 bytes of padding between elements in a struct. The application was for prototyping, so a little wasted memory didn't bother us.) Treat all warnings as errors and then only disable a warning if you know why that warning won't help you. – Kyle A
@JMAA Language errors are normally reported as errors without any special flags. – n.m.
The downside of treating warnings as errors for people following your default build instructions is that your code rots as compilers add new warnings. Users who download your code and try to build it in the future may be unable to, because their compiler is too new and issues a warning about some extra parentheses or something that your compiler didn't care about. The user who encounters the error isn't responsible for your code or your build system, and has no idea how to turn off treating warnings as errors and actually build your project. – interfect
C is, famously, a rather low-level language as HLLs go. C++, though it might seem to be a considerably higher-level language than C, still shares a number of its traits. And one of those traits is that the languages were designed by programmers, for programmers -- and, specifically, programmers who knew what they were doing.
[For the rest of this answer I'm going to focus on C. Most of what I'll say also applies to C++, though perhaps not as strongly. Although as Bjarne Stroustrup has famously said, "C makes it easy to shoot yourself in the foot; C++ makes it harder, but when you do it blows your whole leg off."]
If you know what you are doing -- really know what you are doing -- sometimes you may have to "break the rules". But most of the time, most of us will agree that well-intentioned rules keep us all out of trouble, and that wantonly breaking those rules all the time is a bad idea.
But in C and C++, there are surprisingly large numbers of things you can do that are "bad ideas" but which aren't formally "against the rules". Sometimes they're a bad idea some of the time (but might be defensible other times); sometimes they're a bad idea virtually all of the time. But the tradition has always been not to warn about these things -- because, again, the assumption is that programmers know what they are doing, they wouldn't be doing these things without a good reason, they'd be annoyed by a bunch of unnecessary warnings.
But of course not all programmers really know what they're doing. And, in particular, every C programmer (no matter how experienced) goes through a phase of being a beginning C programmer. And even experienced C programmers can get careless and make mistakes.
Finally, experience has shown not only that programmers do make mistakes, but that these mistakes can have real, serious consequences. If you make a mistake, and the compiler doesn't warn you about it, and somehow the program doesn't immediately crash or do something obviously wrong because of it, the mistake can lurk there, hidden, sometimes for years, until it causes a really big problem.
So it turns out that, most of the time, warnings are a good idea, after all. Even the experienced programmers have learned (actually, it's "especially the experienced programmers have learned") that, on balance, the warnings tend to do more good than harm. For every time you did something wrong deliberately and the warning was a nuisance, there are probably at least ten times you did something wrong by accident and the warning saved you from further trouble. And most warnings can be disabled or worked around for those few times when you really want to do the "wrong" thing.
(A classic example of such a "mistake" is the test if(a = b). Most of the time, this is a mistake, so most compilers these days warn about it -- some even by default. But if you really wanted to both assign b to a and test the result, you can disable the warning by typing if((a = b)).)
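A minimal sketch of that example (with -Wall, gcc and clang both suggest the extra parentheses):

#include <stdio.h>

int main(void)
{
    int a = 0, b = 5;

    if (a = b)            /* -Wparentheses: did you mean a == b ? */
        printf("oops\n");

    if ((a = b))          /* extra parentheses: "yes, I meant to assign" */
        printf("deliberate\n");

    return 0;
}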
The second question is, why would you want to ask the compiler to treat warnings as errors? I'd say it's because of human nature, specifically, the all-too-easy reaction of saying "Oh, that's just a warning, that's not so important, I'll clean that up later." But if you're a procrastinator (and I don't know about you, but I'm a terrible procrastinator) it's easy to put off the necessary cleanup for basically ever -- and if you get into the habit of ignoring warnings, it gets easier and easier to miss an important warning message that's sitting there, unnoticed, in the midst of all the ones you're ignoring.
So asking the compiler to treat warnings as errors is a little trick you can play on yourself to get around this human foible.
Personally, I'm not as insistent about treating warnings as errors. (In fact, if I'm honest, I can say that I virtually never enable that option in my "personal" programming.) But you can be sure I've got that option enabled at work, where our style guide (which I wrote) mandates its use. And I would say -- I suspect most professional programmers would say -- that any shop that doesn't treat warnings as errors in C is behaving irresponsibly, is not adhering to commonly-accepted industry best practices.
"programmers who knew what they were doing" - LOL; there's a "no true Scotsman" fallacy if ever I saw one :) – Dancrumb
@Dancrumb LOL back atcha. I'm never quite sure I understand the No true Scotsman fallacy, but I like it, so this'll be a good exercise for me. I guess the application here is like this: "No C programmer would ever write if(a = b), therefore we don't need to warn about it." (Then someone produces a list of 10 critical bugs in 10 released products that result from this particular error.) "Okay, no experienced C programmer would ever write that..." – Steve Summit
@SteveSummit But a really experienced C programmer may write if (returnCodeFromFoo = foo(bar)) and mean it, to capture and test the code in one place (assume the only purpose of foo is to have side effects!). The fact that a really, really experienced programmer may know this is not a good coding style is beside the point ;) – alephzero
The thing is, most very experienced programmers enable most, if not all, warnings. If they do want to use something like if (returnCodeFromFoo = foo(bar)), then they put a comment in and turn off the warning (so that when the maintenance programmer looks at it 4 years later, he/she will realize that the code is intentional). That said, I worked with someone (in Microsoft C++ land) who insisted that combining /Wall with treating warnings as errors was the way to go. Uh, it isn't (unless you want to put in a lot of suppression comments). – Flydog57
Warnings consist of the best advice some of the most skilled C++ developers could bake into an application. They're worth keeping around.
C++, being a Turing complete language, has plenty of cases where the compiler must simply trust that you knew what you are doing. However, there are many cases where the compiler can realize that you probably did not intend to write what you wrote. A classic example is printf() codes which don't match the arguments, or std::strings passed to printf (not that that ever happens to me!). In these cases, the code you wrote is not an error. It is a valid C++ expression with a valid interpretation for the compiler to act on. But the compiler has a strong hunch that you simply overlooked something which is easy for a modern compiler to detect. These are warnings. They are things that are obvious to a compiler, using all the strict rules of C++ at its disposal, that you might have overlooked.
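For instance, a sketch of the second case: the call below is accepted by the C++ grammar, but gcc warns under -Wformat (part of -Wall) that %s was given a std::string, and clang goes further and rejects the call outright:

#include <cstdio>
#include <string>

int main()
{
    std::string who = "world";
    std::printf("hello, %s\n", who);          // mismatched: %s expects a char*
    std::printf("hello, %s\n", who.c_str());  // the intended call
}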
Turning warnings off, or ignoring them, is like choosing to ignore free advice from those more skilled than you. It's a lesson in hubris that ends either when you fly too close to the sun and your wings melt, or when a memory corruption error occurs. Between the two, I'll take falling from the sky any day!
"Treat warnings as errors" is the extreme version of this philosophy. The idea here is that you resolve every warning the compiler gives you -- you listen to every bit of free advice and act on it. Whether this is a good model for development for you depends on the team and what kind of product you are working on. It's the ascetic approach that a monk might have. For some, it works great. For others, it does not.
On many of my applications we do not treat warnings as errors. We do this because these particular applications need to compile on several platforms with several compilers of varying ages. Sometimes we find it is actually impossible to fix a warning on one side without it turning into a warning on another platform. So we are merely careful. We respect warnings, but we don't bend over backwards for them.
What has C++ being Turing complete got to do with that? A lot of languages are Turing complete and do not trust you if you do something wrong... – Kami Kaze
@KamiKaze Every language will have idiomatic mistakes (e.g. Java can't stop you writing an inconsistent equals/hashCode), and it's a quality-of-implementation issue which of those are reported. – Caleth
@KamiKaze The Turing completeness bit comes in to show that there are cases where the compiler cannot prove that your code will fail to work as planned. This is important because compilers cannot make all "wrong" code an error. Errors can only be reserved for behaviors that the language designers are certain will always be "wrong" (typically because they lead down paths that are inconsistent). – Cort Ammon
Which also points to the challenge with "all warnings are errors." Warnings are, by design, more opportunistic, triggering on some potentially correct code in exchange for triggering on wrong code more often. Warnings as errors leads to you not being able to exercise the full language's capabilities. – Cort Ammon
add a comment |
Not only does handling the warnings make better code, it makes you a better programmer. Warnings will tell you about things that may seem little to you today, but one day that bad habit will come back and bite your head off.
Use the correct type, return that value, evaluate that return value. Take time and reflect: "Is this really the correct type in this context?" "Do I need to return this?" And the biggie: "Is this code going to be portable for the next 10 years?"
Get into the habit of writing warning-free code in the first place.
Non-fixed warnings will, sooner or later, lead to errors in your code.
Debugging a segmentation fault, for instance, requires the programmer to trace the root cause of the fault, which is usually located earlier in your code than the line that eventually triggered the segmentation fault.
Very typically, the root cause is a line for which the compiler had issued a warning that you ignored, while the line that triggered the segmentation fault is merely where the error finally surfaced.
Fixing the warning fixes the problem. A classic!
A demonstration of the above: consider the following code:
#include <stdio.h>

int main(void)
{
    char* str = "Hello world!";
    int idx;
    // Colossal amount of code here, irrelevant to 'idx'
    printf("%c\n", str[idx]);
    return 0;
}
which, when compiled with the -Wextra flag passed to GCC, gives:
main.c: In function 'main':
main.c:9:21: warning: 'idx' is used uninitialized in this function [-Wuninitialized]
    9 |     printf("%c\n", str[idx]);
      |                    ^
which I could ignore and execute the code anyway. And then I would witness a "grand" segmentation fault, as my IP epicurus professor used to say:
Segmentation fault
In order to debug this in a real-world scenario, one would start from the line that causes the segmentation fault and attempt to trace the root of the cause. They would have to search for what has happened to idx and str inside that colossal amount of code over there...
Until, one day, they find themselves discovering that idx is used uninitialized, thus it has a garbage value, which results in indexing the string (way) beyond its bounds, which leads to a segmentation fault.
If only they hadn't ignored the warning, they would have found the bug immediately!
You should always enable compiler warnings, because the compiler can often tell you what's wrong with your code. To do this, you pass -Wall -Wextra to the compiler.
You should usually treat warnings as errors, because they usually signify that there's something wrong with your code, and they are otherwise very easy to ignore. Treating them as errors causes the build to fail, so you can't ignore them. To treat warnings as errors, pass -Werror to the compiler.
Some warnings may indicate a possible semantic error in the code or possible UB, e.g. a stray ; after if(), an unused variable, a global variable masked by a local one, or a comparison of signed and unsigned. Many warnings are related to the static code analyser in the compiler or to breaches of the ISO standard detectable at compile time, which "require diagnostics". While those occurrences may be legal in one particular case, they are the result of design issues most of the time.
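A short sketch of such slips (a hypothetical example; everything below compiles, and gcc flags each marked line given -Wall -Wextra -Wshadow):

#include <stdio.h>

int n = 10;                    /* file-scope n */

int main(void)
{
    int n = 5;                 /* -Wshadow: hides the global n */
    unsigned u = 1;

    if (n > 0);                /* -Wempty-body: stray ';' makes the if do nothing */
        printf("positive\n");  /* runs unconditionally */

    if (n < u)                 /* -Wsign-compare: signed vs unsigned */
        printf("smaller\n");

    return 0;
}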
Some compilers, e.g. gcc, have a command line option to activate "warnings as errors" mode; it's a nice, if cruel, tool to educate novice coders.
Treating warnings as errors is just a means of self-discipline: you were compiling a program to test that shiny new feature, but you can't until you fix the sloppy parts. There is no additional information -Werror provides; it just sets priorities very clearly:
Don't add new code until you fix problems in the existing code
It's really the mindset that's important, not the tools. Compiler diagnostics output is a tool. MISRA (for embedded C) is another tool. It doesn't matter which one you use, but arguably compiler warnings are the easiest tool you can get (it's just one flag to set) and the signal-to-noise ratio is very high. So there's no reason not to use them.
No tool is infallible. If you write const float pi = 3.14;, no tool will tell you that you defined π with a bad precision which may lead to problems down the road. Most tools won't raise an eyebrow at if(tmp < 42), even if it's commonly known that giving variables meaningless names and using magic numbers is a way to disaster in big projects. You have to understand that any "quick test" code you write is just that: a test, and you have to get it right before you move on to other tasks, while you still see its shortcomings. If you leave that code as is, debugging it after you spend two months adding new features will be significantly harder.
Once you get into the right mindset, there is no point in using -Werror. Having warnings as warnings will allow you to make an informed decision whether it still makes sense to run that debug session you were about to start, or to abort it and fix the warnings first.
The other answers are excellent and I don't want to repeat what they have said.
One other aspect to "why enable warnings" that hasn't properly been touched on is that they help enormously with code maintenance. When you write a program of significant size, it becomes impossible to keep the whole thing in your head at once. You typically have a function or three that you're actively writing and thinking about, and perhaps a file or three on your screen that you can refer to, but the bulk of the program exists in the background somewhere and you have to trust that it keeps working.
Having warnings on, and having them as energetic and in your face as possible, helps to alert you if something you change makes trouble for something that you can't see.
Take, for example, the clang warning -Wswitch-enum. That triggers a warning if you use a switch on an enum and miss out one of the possible enum values. It's something you might think would be an unlikely mistake to make: you probably at least looked at the list of enum values when you wrote the switch statement. You might even have an IDE that generated the switch options for you, leaving no room for human error.
This warning really comes into its own when, six months later, you add another possible entry to the enum. Again, if you're thinking about the code in question you'll probably be fine. But if this enum is used for multiple different purposes and it's for one of those that you need the extra option, it's very easy to forget to update a switch in a file you haven't touched for 6 months.
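A sketch of exactly that situation (hypothetical enum; once BLUE is added, gcc and clang flag the switch under -Wswitch-enum, or plain -Wswitch from -Wall when there is no default label):

enum Color { RED, GREEN, BLUE };   /* BLUE added six months later */

const char *name(enum Color c)
{
    switch (c) {                   /* warning: enumeration value 'BLUE' not handled */
    case RED:   return "red";
    case GREEN: return "green";
    }
    return "?";
}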
You can think of warnings in the same way as you'd think of automated test cases: they help you make sure that the code is sensible and doing what you need when you first write it, but they help even more to make sure that it keeps doing what you need while you prod at it. The difference is that test cases work very narrowly to the requirements of your code and you have to write them, while warnings work broadly to sensible standards for almost all code, and they're very generously supplied by the boffins who make the compilers.
As someone who works with legacy embedded C code, enabling compiler warnings has helped show a lot of weaknesses and areas to investigate when proposing fixes. In gcc, utilizing -Wall and -Wextra and even -Wshadow has become vital. I'm not going to go over every single hazard, but I'll list a few that have popped up and helped show code issues.
Variables being left behind
This one can easily point to unfinished work and areas that might not be utilizing all of the passed variables which could be an issue. Let's look at a simple function that may trigger this:
int foo(int a, int b)
{
    int c = 0;
    if (a > 0)
        return a;
    return 0;
}
Just compiling this without -Wall or -Wextra returns no issues. -Wall will tell you, though, that c is never used:
foo.c: In function 'foo':
foo.c:9:20: warning: unused variable 'c' [-Wunused-variable]
-Wextra will also tell you that your parameter b doesn't do anything:
foo.c: In function 'foo':
foo.c:9:20: warning: unused variable 'c' [-Wunused-variable]
foo.c:7:20: warning: unused parameter 'b' [-Wunused-parameter]
 int foo(int a, int b)
Global variable shadowing
This one bit hard and did not show up until -Wshadow was used. Let's modify the example above to just add the two parameters, but there just happens to be a global with the same name as a local, which causes a lot of confusion when trying to use both.
int c = 7;

int foo(int a, int b)
{
    int c = a + b;
    return c;
}
With -Wshadow turned on, it's easy to spot this issue.
foo.c:11:9: warning: declaration of ‘c’ shadows a global declaration
[-Wshadow]
foo.c:1:5: note: shadowed declaration is here
Format strings
This doesn't require any extra flags in gcc, but it has still been the source of problems in the past. A simple function trying to print data, but with a formatting error, could look like this:
void foo(const char * str)
{
    printf("str = %d\n", str);
}
This doesn't print the string since the formatting flag is wrong and gcc will happily tell you this is probably not what you wanted:
foo.c: In function 'foo':
foo.c:10:12: warning: format '%d' expects argument of type 'int', but argument 2 has type 'const char *' [-Wformat=]
These are just three of the many things the compiler can double-check for you. There are a lot of others, like using an uninitialized variable, that others have pointed out.
This is a specific answer to C, and why this is far more important to C than to anything else.
#include <stdio.h>

int main(void)
{
    FILE *fp = "some string";
}
This code compiles with a warning. What are and should be errors in just about every other language on the planet (barring assembly language) are warnings in C. Warnings in C are almost always errors in disguise. Warnings should be fixed, not suppressed.
With gcc, we do this as gcc -Wall -Werror.
This was also the reason for the high rantyness about some MS non-secure API warnings. Most people programming C have learned the hard way to treat warnings as errors and this stuff appeared that just wasn't the same kind of thing and wanted non-portable fixes.
Take it easy: you don't have to, and it is not necessary. -Wall and -Werror were designed by code-refactoring maniacs for themselves. But other people who hate it when others break their code also find it useful.
So, this is just an optional feature, which is totally up to your preference to use or not.
I use it all the time because it helps to fix my mistakes.
Although it is not mandatory, it is highly recommended to use them. – Spikatrix
"-Wall and -Werror was designed by code-refactoring maniacs for themselves." [citation needed] – YSC
It seems like you're contradicting yourself. If you "use it all the time because it helps to fix [your] mistakes," isn't it worth teaching to newer programmers so that they'll be doing it everywhere from the get-go? I don't think this question is asking whether or not it's possible to compile without -Wall and -Werror, it's just asking if it's a good idea. Which, from your last sentence, it sounds like you're saying it is. – scohe001
add a comment |
Your Answer
StackExchange.ifUsing("editor", function ()
StackExchange.using("externalEditor", function ()
StackExchange.using("snippets", function ()
StackExchange.snippets.init();
);
);
, "code-snippets");
StackExchange.ready(function()
var channelOptions =
tags: "".split(" "),
id: "1"
;
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function()
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled)
StackExchange.using("snippets", function()
createEditor();
);
else
createEditor();
);
function createEditor()
StackExchange.prepareEditor(
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader:
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/4.0/"u003ecc by-sa 4.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
,
onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
);
);
Sign up or log in
StackExchange.ready(function ()
StackExchange.helpers.onClickDraftSave('#login-link');
);
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function ()
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fstackoverflow.com%2fquestions%2f57842756%2fhow-do-i-always-enable-compiler-warnings%23new-answer', 'question_page');
);
Post as a guest
Required, but never shown
12 Answers
12
active
oldest
votes
12 Answers
12
active
oldest
votes
active
oldest
votes
active
oldest
votes
Why enable warnings?
C and C++ compilers are notoriously bad at reporting some common programmer mistakes by default, such as:
- forgetting to initialise a variable
- forgetting to
return
a value from a function - arguments in
printf
andscanf
families not matching the format string - a function is used without being declared beforehand (C only)
These can be detected and reported, just usually not by default; this feature must be explicitly requested via compiler options.
How to enable warnings?
This depends on your compiler.
Microsoft C and C++ compilers understand switches like /W1
, /W2
, /W3
, /W4
and /Wall
. Use at least /W3
. /W4
and /Wall
may emit spurious warnings for system header files, but if your project compiles cleanly with one of these options, go for it. These options are mutually exclusive.
Most other compilers understand options like -Wall
, -Wpedantic
and -Wextra
. -Wall
is essential and all the rest are recommended. These options can be used separately or all together.
Your IDE may have a way to enable these from the user interface.
Why treat warnings as errors? They are just warnings!
A compiler warning signals a potentially serious problem in your code. The problems listed above are almost always fatal; others may or may not be, but you want compilation to fail even if it turns out to be a false alarm. Investigate each warning, find the root cause, and fix it. In the case of a false alarm, work around it — that is, use a different language feature or construct so that the warning is no longer triggered. If this proves to be very hard, disable that particular warning on a case by case basis.
You don't want to just leave warnings as warnings even if all of them are false alarms. It could be OK for very small projects where the total number of warnings emitted is less than 7. Anything more, and it's easy for a new warning to get lost in a flood of old familiar ones. Don't allow that. Just cause all your project to compile cleanly.
How to treat warnings as errors?
This is again done with compiler switches. /WX
is for Microsoft, most others use -Werror
. In either case, the compilation will fail if there are any warnings produced.
27
I have posted this Q&A because I'm sick and tired of telling people to enable warnings. Now I can just point them here (or, if I'm in a particularly evil mood, close their question as a dupe). You are welcome to improve this answer or add your own!
– n.m.
yesterday
4
You can also use clang's -Weverything
– pmg
yesterday
4
The only modifier I would add is that some warnings may not be helpful for your application. (I've seen warnings that the compiler added 2 bytes of padding between elements in a struct. The application was for prototyping, so a little wasted memory didn't bother us.) Treat all warnings as errors and then only disable a warning if you know why that warning won't help you.
– Kyle A
yesterday
1
@JMAA Language errors are normally reported as errors without any special flags.
– n.m.
yesterday
4
The downside of treating warnings as errors for people following your default build instructions is that your code rots as compilers add new warnings. Users who download your code and try to build it in the future may be unable to, because their compiler is too new and issues a warning about some extra parentheses or something that your compiler didn't care about. The user who encounters the error isn't responsible for your code or your build system, and has no idea how to turn off treating warnings as errors and actually build your project.
– interfect
5 hours ago
|
show 10 more comments
Why enable warnings?
C and C++ compilers are notoriously bad at reporting some common programmer mistakes by default, such as:
- forgetting to initialise a variable
- forgetting to
return
a value from a function - arguments in
printf
andscanf
families not matching the format string - a function is used without being declared beforehand (C only)
These can be detected and reported, just usually not by default; this feature must be explicitly requested via compiler options.
How to enable warnings?
This depends on your compiler.
Microsoft C and C++ compilers understand switches like /W1
, /W2
, /W3
, /W4
and /Wall
. Use at least /W3
. /W4
and /Wall
may emit spurious warnings for system header files, but if your project compiles cleanly with one of these options, go for it. These options are mutually exclusive.
Most other compilers understand options like -Wall
, -Wpedantic
and -Wextra
. -Wall
is essential and all the rest are recommended. These options can be used separately or all together.
Your IDE may have a way to enable these from the user interface.
Why treat warnings as errors? They are just warnings!
A compiler warning signals a potentially serious problem in your code. The problems listed above are almost always fatal; others may or may not be, but you want compilation to fail even if it turns out to be a false alarm. Investigate each warning, find the root cause, and fix it. In the case of a false alarm, work around it — that is, use a different language feature or construct so that the warning is no longer triggered. If this proves to be very hard, disable that particular warning on a case by case basis.
You don't want to just leave warnings as warnings even if all of them are false alarms. It could be OK for very small projects where the total number of warnings emitted is less than 7. Anything more, and it's easy for a new warning to get lost in a flood of old familiar ones. Don't allow that. Just cause all your project to compile cleanly.
How to treat warnings as errors?
This is again done with compiler switches. /WX
is for Microsoft, most others use -Werror
. In either case, the compilation will fail if there are any warnings produced.
27
I have posted this Q&A because I'm sick and tired of telling people to enable warnings. Now I can just point them here (or, if I'm in a particularly evil mood, close their question as a dupe). You are welcome to improve this answer or add your own!
– n.m.
yesterday
4
You can also use clang's -Weverything
– pmg
yesterday
4
The only modifier I would add is that some warnings may not be helpful for your application. (I've seen warnings that the compiler added 2 bytes of padding between elements in a struct. The application was for prototyping, so a little wasted memory didn't bother us.) Treat all warnings as errors and then only disable a warning if you know why that warning won't help you.
– Kyle A
yesterday
1
@JMAA Language errors are normally reported as errors without any special flags.
– n.m.
yesterday
4
The downside of treating warnings as errors for people following your default build instructions is that your code rots as compilers add new warnings. Users who download your code and try to build it in the future may be unable to, because their compiler is too new and issues a warning about some extra parentheses or something that your compiler didn't care about. The user who encounters the error isn't responsible for your code or your build system, and has no idea how to turn off treating warnings as errors and actually build your project.
– interfect
5 hours ago
|
show 10 more comments
Why enable warnings?
C and C++ compilers are notoriously bad at reporting some common programmer mistakes by default, such as:
- forgetting to initialise a variable
- forgetting to
return
a value from a function - arguments in
printf
andscanf
families not matching the format string - a function is used without being declared beforehand (C only)
These can be detected and reported, just usually not by default; this feature must be explicitly requested via compiler options.
How to enable warnings?
This depends on your compiler.
Microsoft C and C++ compilers understand switches like /W1
, /W2
, /W3
, /W4
and /Wall
. Use at least /W3
. /W4
and /Wall
may emit spurious warnings for system header files, but if your project compiles cleanly with one of these options, go for it. These options are mutually exclusive.
Most other compilers understand options like -Wall
, -Wpedantic
and -Wextra
. -Wall
is essential and all the rest are recommended. These options can be used separately or all together.
Your IDE may have a way to enable these from the user interface.
Why treat warnings as errors? They are just warnings!
A compiler warning signals a potentially serious problem in your code. The problems listed above are almost always fatal; others may or may not be, but you want compilation to fail even if it turns out to be a false alarm. Investigate each warning, find the root cause, and fix it. In the case of a false alarm, work around it — that is, use a different language feature or construct so that the warning is no longer triggered. If this proves to be very hard, disable that particular warning on a case by case basis.
You don't want to just leave warnings as warnings even if all of them are false alarms. It could be OK for very small projects where the total number of warnings emitted is less than 7. Anything more, and it's easy for a new warning to get lost in a flood of old familiar ones. Don't allow that. Just cause all your project to compile cleanly.
How to treat warnings as errors?
This is again done with compiler switches. /WX
is for Microsoft, most others use -Werror
. In either case, the compilation will fail if there are any warnings produced.
Why enable warnings?
C and C++ compilers are notoriously bad at reporting some common programmer mistakes by default, such as:
- forgetting to initialise a variable
- forgetting to
return
a value from a function - arguments in
printf
andscanf
families not matching the format string - a function is used without being declared beforehand (C only)
These can be detected and reported, just usually not by default; this feature must be explicitly requested via compiler options.
How to enable warnings?
This depends on your compiler.
Microsoft C and C++ compilers understand switches like /W1
, /W2
, /W3
, /W4
and /Wall
. Use at least /W3
. /W4
and /Wall
may emit spurious warnings for system header files, but if your project compiles cleanly with one of these options, go for it. These options are mutually exclusive.
Most other compilers understand options like -Wall
, -Wpedantic
and -Wextra
. -Wall
is essential and all the rest are recommended. These options can be used separately or all together.
Your IDE may have a way to enable these from the user interface.
Why treat warnings as errors? They are just warnings!
A compiler warning signals a potentially serious problem in your code. The problems listed above are almost always fatal; others may or may not be, but you want compilation to fail even if it turns out to be a false alarm. Investigate each warning, find the root cause, and fix it. In the case of a false alarm, work around it — that is, use a different language feature or construct so that the warning is no longer triggered. If this proves to be very hard, disable that particular warning on a case by case basis.
You don't want to just leave warnings as warnings even if all of them are false alarms. It could be OK for very small projects where the total number of warnings emitted is less than 7. Anything more, and it's easy for a new warning to get lost in a flood of old familiar ones. Don't allow that. Just cause all your project to compile cleanly.
How to treat warnings as errors?
This is again done with compiler switches. /WX
is for Microsoft, most others use -Werror
. In either case, the compilation will fail if there are any warnings produced.
edited 12 hours ago
answered yesterday
n.m.n.m.
77.6k9 gold badges93 silver badges180 bronze badges
77.6k9 gold badges93 silver badges180 bronze badges
27
I have posted this Q&A because I'm sick and tired of telling people to enable warnings. Now I can just point them here (or, if I'm in a particularly evil mood, close their question as a dupe). You are welcome to improve this answer or add your own!
– n.m.
yesterday
4
You can also use clang's -Weverything
– pmg
yesterday
4
The only modifier I would add is that some warnings may not be helpful for your application. (I've seen warnings that the compiler added 2 bytes of padding between elements in a struct. The application was for prototyping, so a little wasted memory didn't bother us.) Treat all warnings as errors and then only disable a warning if you know why that warning won't help you.
– Kyle A
yesterday
1
@JMAA Language errors are normally reported as errors without any special flags.
– n.m.
yesterday
4
The downside of treating warnings as errors for people following your default build instructions is that your code rots as compilers add new warnings. Users who download your code and try to build it in the future may be unable to, because their compiler is too new and issues a warning about some extra parentheses or something that your compiler didn't care about. The user who encounters the error isn't responsible for your code or your build system, and has no idea how to turn off treating warnings as errors and actually build your project.
– interfect
5 hours ago
|
show 10 more comments
27
I have posted this Q&A because I'm sick and tired of telling people to enable warnings. Now I can just point them here (or, if I'm in a particularly evil mood, close their question as a dupe). You are welcome to improve this answer or add your own!
– n.m.
yesterday
4
You can also use clang's -Weverything
– pmg
yesterday
4
The only modifier I would add is that some warnings may not be helpful for your application. (I've seen warnings that the compiler added 2 bytes of padding between elements in a struct. The application was for prototyping, so a little wasted memory didn't bother us.) Treat all warnings as errors and then only disable a warning if you know why that warning won't help you.
– Kyle A
yesterday
1
@JMAA Language errors are normally reported as errors without any special flags.
– n.m.
yesterday
4
The downside of treating warnings as errors for people following your default build instructions is that your code rots as compilers add new warnings. Users who download your code and try to build it in the future may be unable to, because their compiler is too new and issues a warning about some extra parentheses or something that your compiler didn't care about. The user who encounters the error isn't responsible for your code or your build system, and has no idea how to turn off treating warnings as errors and actually build your project.
– interfect
5 hours ago
27
27
I have posted this Q&A because I'm sick and tired of telling people to enable warnings. Now I can just point them here (or, if I'm in a particularly evil mood, close their question as a dupe). You are welcome to improve this answer or add your own!
– n.m.
yesterday
I have posted this Q&A because I'm sick and tired of telling people to enable warnings. Now I can just point them here (or, if I'm in a particularly evil mood, close their question as a dupe). You are welcome to improve this answer or add your own!
– n.m.
yesterday
4
4
You can also use clang's -Weverything
– pmg
yesterday
You can also use clang's -Weverything
– pmg
yesterday
4
4
The only modifier I would add is that some warnings may not be helpful for your application. (I've seen warnings that the compiler added 2 bytes of padding between elements in a struct. The application was for prototyping, so a little wasted memory didn't bother us.) Treat all warnings as errors and then only disable a warning if you know why that warning won't help you.
– Kyle A
yesterday
The only modifier I would add is that some warnings may not be helpful for your application. (I've seen warnings that the compiler added 2 bytes of padding between elements in a struct. The application was for prototyping, so a little wasted memory didn't bother us.) Treat all warnings as errors and then only disable a warning if you know why that warning won't help you.
– Kyle A
yesterday
1
1
@JMAA Language errors are normally reported as errors without any special flags.
– n.m.
yesterday
@JMAA Language errors are normally reported as errors without any special flags.
– n.m.
yesterday
4
4
The downside of treating warnings as errors for people following your default build instructions is that your code rots as compilers add new warnings. Users who download your code and try to build it in the future may be unable to, because their compiler is too new and issues a warning about some extra parentheses or something that your compiler didn't care about. The user who encounters the error isn't responsible for your code or your build system, and has no idea how to turn off treating warnings as errors and actually build your project.
– interfect
5 hours ago
The downside of treating warnings as errors for people following your default build instructions is that your code rots as compilers add new warnings. Users who download your code and try to build it in the future may be unable to, because their compiler is too new and issues a warning about some extra parentheses or something that your compiler didn't care about. The user who encounters the error isn't responsible for your code or your build system, and has no idea how to turn off treating warnings as errors and actually build your project.
– interfect
5 hours ago
|
show 10 more comments
C is, famously, a rather low-level language as HLLs go. C++, though it might seem to be a considerably higher-level language than C, still shares a number of its traits. And one of those traits is that the languages were designed by programmers, for programmers -- and, specifically, programmers who knew what they were doing.
[For the rest of this answer I'm going to focus on C. Most of what I'll say also applies to C++, though perhaps not as strongly. Although as Bjarne Stroustrup has famously said, "C makes it easy to shoot yourself in the foot; C++ makes it harder, but when you do it blows your whole leg off."]
If you know what you are doing -- really know what you are doing -- sometimes you may have to "break the rules". But most of the time, most of us will agree that well-intentioned rules keep us all out of trouble, and that wantonly breaking those rules all the time is a bad idea.
But in C and C++, there are surprisingly large numbers of things you can do that are "bad ideas" but which aren't formally "against the rules". Sometimes they're a bad idea some of the time (but might be defensible other times); sometimes they're a bad idea virtually all of the time. But the tradition has always been not to warn about these things -- because, again, the assumption is that programmers know what they are doing, they wouldn't be doing these things without a good reason, they'd be annoyed by a bunch of unnecessary warnings.
But of course not all programmers really know what they're doing. And, in particular, every C programmer (no matter how experienced) goes through a phase of being a beginning C programmer. And even experienced C programmers can get careless and make mistakes.
Finally, experience has shown not only that programmers do make mistakes, but that these mistakes can have real, serious consequences. If you make a mistake, and the compiler doesn't warn you about it, and somehow the program doesn't immediately crash or do something obviously wrong because of it, the mistake can lurk there, hidden, sometimes for years, until it causes a really big problem.
So it turns out that, most of the time, warnings are a good idea, after all. Even the experienced programmers have learned (actually, it's "especially the experienced programmers have learned") that, on balance, the warnings tend to do more good than harm. For every time you did something wrong deliberately and the warning was a nuisance, there are probably at least ten times you did something wrong by accident and the warning saved you from further trouble. And most warnings can be disabled or worked around for those few times when you really want to do the "wrong" thing.
(A classic example of such a "mistake" is the test if(a = b). Most of the time, this is a mistake, so most compilers these days warn about it -- some even by default. But if you really wanted to both assign b to a and test the result, you can disable the warning by typing if((a = b)).)
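Here's a minimal sketch of both halves of that (assuming GCC or Clang with -Wall; the variables are invented):

#include <stdio.h>

int main(void)
{
    int a = 0, b = 42;

    // -Wparentheses (enabled by -Wall) warns here: "suggest parentheses
    // around assignment used as truth value" -- almost always a typo for ==
    if (a = b)
        printf("a is now %d\n", a);

    // The extra parentheses signal that the assignment is intentional,
    // so the warning goes away.
    if ((a = b))
        printf("a is still %d\n", a);

    return 0;
}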
The second question is, why would you want to ask the compiler to treat warnings as errors? I'd say it's because of human nature, specifically, the all-too-easy reaction of saying "Oh, that's just a warning, that's not so important, I'll clean that up later." But if you're a procrastinator (and I don't know about you, but I'm a terrible procrastinator) it's easy to put off the necessary cleanup basically forever -- and if you get into the habit of ignoring warnings, it gets easier and easier to miss an important warning message that's sitting there, unnoticed, in the midst of all the ones you're ignoring.
So asking the compiler to treat warnings as errors is a little trick you can play on yourself to get around this human foible.
Personally, I'm not as insistent about treating warnings as errors. (In fact, if I'm honest, I can say that I virtually never enable that option in my "personal" programming.) But you can be sure I've got that option enabled at work, where our style guide (which I wrote) mandates its use. And I would say -- I suspect most professional programmers would say -- that any shop that doesn't treat warnings as errors in C is behaving irresponsibly and is not adhering to commonly accepted industry best practices.
edited yesterday
answered yesterday
Steve Summit
"programmers who knew what they were doing" - LOL; there's a "no true Scotsman" fallacy if ever I saw one :)
– Dancrumb
9 hours ago
@Dancrumb LOL back atcha. I'm never quite sure I understand the No True Scotsman fallacy, but I like it, so this'll be a good exercise for me. I guess the application here is like this: "No C programmer would ever write if(a = b), therefore we don't need to warn about it." (Then someone produces a list of 10 critical bugs in 10 released products that result from this particular error.) "Okay, no experienced C programmer would ever write that..."
– Steve Summit
8 hours ago
@SteveSummit but a really experienced C programmer may write if (returnCodeFromFoo = foo(bar)) and mean it, to capture and test the return code in one place. (Assume the only purpose of foo is to have side effects!) The fact that a really, really experienced programmer may know this is not a good coding style is beside the point ;)
– alephzero
6 hours ago
The thing is, most very experienced programmers enable most, if not all, warnings. If they do want to use something like if (returnCodeFromFoo = foo(bar)), then they put a comment in and turn off the warning (so that when the maintenance programmer looks at it 4 years later, he/she will realize that the code is intentional). That said, I worked with someone (in Microsoft C++ land) who insisted that combining /Wall with treating warnings as errors was the way to go. Uh, it isn't (unless you want to put in a lot of suppression comments).
– Flydog57
5 hours ago
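For what "put a comment in and turn off the warning" can look like in practice, here is a hedged sketch using the GCC/Clang diagnostic pragmas; foo and the calling code are invented for illustration:

#include <stdio.h>

// Hypothetical helper: its only purpose is a side effect plus a status code.
static int foo(int bar)
{
    printf("side effect for %d\n", bar);
    return bar % 2;
}

int main(void)
{
    int returnCodeFromFoo;

    // Intentional assignment inside the condition: capture and test the
    // return code in one place. The pragmas silence -Wparentheses for
    // just this spot (supported by both GCC and Clang).
#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Wparentheses"
    if (returnCodeFromFoo = foo(3))
        printf("foo returned %d\n", returnCodeFromFoo);
#pragma GCC diagnostic pop

    return 0;
}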
Warnings consist of the best advice some of the most skilled C++ developers could bake into the compiler. They're worth keeping around.
C++, being a Turing complete language, has plenty of cases where the compiler must simply trust that you knew what you were doing. However, there are many cases where the compiler can realize that you probably did not intend to write what you wrote. A classic example is printf() format codes which don't match their arguments, or a std::string passed to printf (not that that ever happens to me!). In these cases, the code you wrote is not an error. It is a valid C++ expression with a valid interpretation for the compiler to act on. But the compiler has a strong hunch that you simply overlooked something which is easy for a modern compiler to detect. These are warnings. They are things that are obvious to a compiler, using all the strict rules of C++ at its disposal, that you might have overlooked.
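As a small illustration of that kind of hunch (a sketch, assuming GCC or Clang with -Wall; the mismatched call is guarded by if (0) so the program itself stays well-defined):

#include <stdio.h>

int main(void)
{
    int answer = 42;

    if (0) {
        // Deliberately wrong: %s promises a char*, but we pass an int.
        // Grammatically valid, so the compiler must accept it -- yet
        // -Wformat (part of -Wall) flags the mismatch at compile time.
        printf("%s\n", answer);
    }

    printf("%d\n", answer);  // what was almost certainly meant
    return 0;
}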
Turning warnings off, or ignoring them, is like choosing to ignore free advice from those more skilled than you. It's a lesson in hubris that ends either when you fly too close to the sun and your wings melt, or when a memory corruption error occurs. Between the two, I'll take falling from the sky any day!
"Treat warnings as errors" is the extreme version of this philosophy. The idea here is that you resolve every warning the compiler gives you -- you listen to every bit of free advice and act on it. Whether this is a good model for development for you depends on the team and what kind of product you are working on. It's the ascetic approach that a monk might have. For some, it works great. For others, it does not.
On many of my applications we do not treat warnings as errors. We do this because these particular applications need to compile on several platforms with several compilers of varying ages. Sometimes we find it is actually impossible to fix a warning on one side without it turning into a warning on another platform. So we are merely careful. We respect warnings, but we don't bend over backwards for them.
answered 21 hours ago
Cort Ammon
What does C++ being Turing complete have to do with that? A lot of languages are Turing complete and do not trust you if you do something wrong....
– Kami Kaze
18 hours ago
@KamiKaze every language will have idiomatic mistakes (e.g. Java can't stop you writing an inconsistent equals/hashCode), and it's a quality-of-implementation issue which of those are reported.
– Caleth
15 hours ago
@KamiKaze The Turing completeness bit comes in to show that there are cases where the compiler cannot prove that your code will fail to work as planned. This is important because compilers cannot make all "wrong" code an error. Errors can only be reserved for behaviors that the language designers are certain will always be "wrong." (typically because it leads down paths that are inconsistent).
– Cort Ammon
4 hours ago
Which also points to the challenge with "all warnings are errors." Warnings are, by design, more opportunistic, triggering on some potentially correct code in exchange for triggering on wrong code more often. Warnings as errors leads to you not being able to exercise the full language's capabilities.
– Cort Ammon
4 hours ago
Not only does handling the warnings make for better code, it makes you a better programmer. Warnings will tell you about things that may seem little to you today, but one day that bad habit will come back and bite your head off.
Use the correct type, return that value, evaluate that return value. Take time and reflect: "Is this really the correct type in this context?" "Do I need to return this?" And the biggie: "Is this code going to be portable for the next 10 years?"
Get into the habit of writing warning-free code in the first place.
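One way to build that habit is to let the compiler nag about discarded status codes. A sketch, assuming GCC or Clang; parse_config is a made-up function and the attribute is a compiler extension, not standard C:

#include <stdio.h>

// Made-up helper marked so callers that discard its status get flagged.
__attribute__((warn_unused_result))
static int parse_config(const char *path)
{
    return path != NULL ? 0 : -1;  // pretend this can actually fail
}

int main(void)
{
    parse_config("app.conf");  // -Wunused-result: return value ignored

    if (parse_config("app.conf") != 0)  // the habit argued for above
        fprintf(stderr, "config failed\n");

    return 0;
}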
answered 18 hours ago
RedSonja
Unfixed warnings will, sooner or later, lead to errors in your code.
Debugging a segmentation fault, for instance, requires the programmer to trace the root cause of the fault, which is usually located earlier in the code than the line that eventually triggered the crash.
It's very typical that the cause is a line for which the compiler had issued a warning that you ignored, while the line that caused the segmentation fault is merely where the error finally surfaced.
Fixing the warning leads to fixing the problem. A classic!
A demonstration of the above: consider the following code:
#include <stdio.h>

int main(void)
{
    char* str = "Hello world!";
    int idx;

    // Colossal amount of code here, irrelevant to 'idx'

    printf("%c\n", str[idx]);

    return 0;
}
which, when compiled with the -Wextra flag passed to GCC, gives:
main.c: In function 'main':
main.c:9:21: warning: 'idx' is used uninitialized in this function [-Wuninitialized]
    9 |     printf("%c\n", str[idx]);
      |                         ^
which I could ignore and execute the code anyway. And then I would witness a "grand" segmentation fault, as my IP professor Epicurus used to say:
Segmentation fault
In order to debug this in a real-world scenario, one would start from the line that causes the segmentation fault and attempt to trace the root of the cause. They would have to search for what has happened to idx and str inside that colossal amount of code over there...
Until, one day, they found themselves discovering that idx is used uninitialized, and thus has a garbage value, which results in indexing the string (way) beyond its bounds, which leads to a segmentation fault.
If only they hadn't ignored the warning, they would have found the bug immediately!
answered 6 hours ago
gsamaras
You should always enable compiler warnings, because the compiler can often tell you what's wrong with your code. To do this, you pass -Wall -Wextra to the compiler.
You should usually treat warnings as errors, because warnings usually signify that there's something wrong with your code, and it's very easy to ignore them. Treating them as errors makes the build fail, so you can't ignore them. To treat warnings as errors, pass -Werror to the compiler.
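Putting those flags together, a minimal sketch (the file and variable names are invented; gcc stands in for whichever GCC or Clang driver you use):

/* Build with:  gcc -Wall -Wextra -Werror demo.c -o demo
   (Clang accepts the same flags.) Without -Werror this compiles and only
   warns; with -Werror the warning below becomes a hard build error. */
#include <stdio.h>

int main(void)
{
    int unused = 42;  // -Wunused-variable, enabled by -Wall

    printf("hello\n");
    return 0;
}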
answered 14 hours ago
JL2210
Some warnings may signal a possible semantic error in the code or possible UB, e.g. a stray ; after if(), an unused variable, a global variable masked by a local one, or a comparison of signed and unsigned. Many warnings are related to the static code analyzer in the compiler, or to breaches of the ISO standard detectable at compile time, which "require diagnostics". While those occurrences may be legal in one particular case, they are the result of design issues most of the time.
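A sketch packing several of those cases into one file (flag names as in GCC; note that -Wshadow is not part of -Wall or -Wextra and must be requested explicitly):

#include <stdio.h>

int total = 0;  // file-scope variable

int main(void)
{
    int total = 5;       // -Wshadow: local masks the global
    int unused;          // -Wunused-variable (in -Wall)
    unsigned int limit = 3;

    if (total > 0);      // -Wempty-body (in -Wextra): the stray ';' means
    {                    // the block below runs unconditionally
        puts("always printed");
    }

    for (int i = 0; i < limit; i++)  // -Wsign-compare (in -Wextra for C)
        printf("%d\n", i);

    return 0;
}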
Some compilers, e.g. GCC, have a command-line option to activate "warnings as errors" mode; it's a nice, if cruel, tool for educating novice coders.
answered yesterday
Swift - Friday Pie
Treating warnings as errors is just a means of self-discipline: you were compiling a program to test that shiny new feature, but you can't until you fix the sloppy parts. There is no additional information -Werror provides; it just sets priorities very clearly:
Don't add new code until you fix problems in the existing code
It's really the mindset that's important, not the tools. Compiler diagnostics output is a tool. MISRA (for embedded C) is another tool. It doesn't matter which one you use, but compiler warnings are arguably the easiest tool you can get (it's just one flag to set) and the signal-to-noise ratio is very high. So there's no reason not to use them.
No tool is infallible. If you write const float pi = 3.14;, no tool will tell you that you defined π with poor precision, which may lead to problems down the road. Most tools won't raise an eyebrow at if(tmp < 42), even though it's commonly known that giving variables meaningless names and using magic numbers is a road to disaster in big projects. You have to understand that any "quick test" code you write is just that: a test, and you have to get it right before you move on to other tasks, while you still see its shortcomings. If you leave that code as-is, debugging it after you spend two months adding new features will be significantly harder.
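To make the π example concrete (a small illustration; both lines compile silently under -Wall -Wextra, since the truncation is a question of value, not of syntax):

#include <stdio.h>

int main(void)
{
    // Both definitions compile cleanly under -Wall -Wextra; only a human
    // (or a domain-specific linter) notices the precision problem.
    const float  pi_lossy   = 3.14f;
    const double pi_precise = 3.14159265358979323846;

    printf("%.10f\n%.10f\n", (double)pi_lossy, pi_precise);
    return 0;
}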
Once you get into the right mindset, there is no point in using -Werror. Having warnings as warnings allows you to make an informed decision whether it still makes sense to run that debug session you were about to start, or to abort it and fix the warnings first.
add a comment |
Treating warnings as errors is just a mean of self-discipline: you were compiling a program to test that shiny new feature, but you can't until you fix the sloppy parts. There is no additional information Werror
provides, it just sets priorities very clearly:
Don't add new code until you fix problems in the existing code
It's really the mindset that's important, not the tools. Compiler diagnostics output is a tool. MISRA (for embedded C) is another tool. It doesn't matter which one you use, but arguably compiler warnings is the easiest tool you can get (it's just one flag to set) and the signal to noise ratio is very high. So there's no reason not to use it.
No tool is infallible. If you write const float pi = 3.14;
, no tool will tell you that you defined π with a bad precision which may lead to problems down the road. Most tools won't raise an eyebrow on if(tmp < 42)
, even if it's commonly known that giving variables meaningless names and using magic numbers is a way to disaster in big projects. You have to understand that any "quick test" code you write is just that: a test, and you have to get it right before you move on to other tasks, while you still see its shortcomings. If you leave that codes as is, debugging if after you spend two months adding new features will be significantly harder.
Once you get into the right mindset, there is no point in using Werror
. Having warnings as warnings will allow you to take an informed decision whether it still makes sense to run that debug session you were about to start, or to abort it and fix the warnings first.
– Dmitry Grigoryev
The other answers are excellent and I don't want to repeat what they have said.
One other aspect of "why enable warnings" that hasn't properly been touched on is that they help enormously with code maintenance. When you write a program of significant size, it becomes impossible to keep the whole thing in your head at once. You typically have a function or three that you're actively writing and thinking about, and perhaps a file or three on your screen that you can refer to, but the bulk of the program exists in the background somewhere and you have to trust that it keeps working.
Having warnings on, and having them as energetic and in your face as possible, helps to alert you if something you change makes trouble for something that you can't see.
Take, for example, the clang warning -Wswitch-enum. That triggers a warning if you switch on an enum and miss out one of the possible enum values. It's something you might think would be an unlikely mistake to make: you probably at least looked at the list of enum values when you wrote the switch statement. You might even have an IDE that generated the switch options for you, leaving no room for human error.
This warning really comes into its own when, six months later, you add another possible entry to the enum. Again, if you're thinking about the code in question you'll probably be fine. But if this enum is used for multiple different purposes and it's for one of those that you need the extra option, it's very easy to forget to update a switch in a file you haven't touched for six months.
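A minimal sketch of that scenario (the names are hypothetical): the switch was complete when written, then BANANA was added later, and compiling with clang -Wswitch-enum flags the stale switch:
enum Fruit { APPLE, ORANGE, BANANA };   /* BANANA added six months later */

const char *fruit_name(enum Fruit f)
{
    switch (f) {   /* warning: enumeration value 'BANANA' not handled in switch */
    case APPLE:  return "apple";
    case ORANGE: return "orange";
    }
    return "unknown";
}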
You can think of warnings in the same way as you'd think of automated test cases: they help you make sure that the code is sensible and doing what you need when you first write it, but they help even more to make sure that it keeps doing what you need while you prod at it. The difference is that test cases work very narrowly to the requirements of your code and you have to write them, while warnings work broadly to sensible standards for almost all code, and they're very generously supplied by the boffins who make the compilers.
– Josiah
As someone who works with legacy embedded C code, enabling compiler warnings has helped expose a lot of weaknesses and areas to investigate when proposing fixes. In gcc, utilizing -Wall, -Wextra, and even -Wshadow has become vital. I'm not going to go over every single hazard, but I'll list a few that have popped up and helped reveal code issues.
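For reference, a typical invocation with these flags might look like this (the file name is just an example):
gcc -Wall -Wextra -Wshadow -c foo.c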
Variables being left behind
This one can easily point to unfinished work and to areas that might not be utilizing all of the passed variables, which could be an issue. Let's look at a simple function that may trigger this:
int foo(int a, int b)
{
    int c = 0;
    if (a > 0)
    {
        return a;
    }
    return 0;
}
Just compiling this without -Wall or -Wextra reports no issues. -Wall, though, will tell you that c is never used:
foo.c: In function ‘foo’:
foo.c:9:20: warning: unused variable ‘c’ [-Wunused-variable]
-Wextra will also tell you that your parameter b doesn't do anything:
foo.c: In function ‘foo’:
foo.c:9:20: warning: unused variable ‘c’ [-Wunused-variable]
foo.c:7:20: warning: unused parameter ‘b’ [-Wunused-parameter]
 int foo(int a, int b)
Global Variable shadowing
This one bit hard and did not show up until -Wshadow was used. Let's modify the example above so that it just adds, but a global happens to share a name with a local, which causes a lot of confusion when trying to use both.
int c = 7;

int foo(int a, int b)
{
    int c = a + b;
    return c;
}
With -Wshadow turned on, it's easy to spot this issue:
foo.c:11:9: warning: declaration of ‘c’ shadows a global declaration [-Wshadow]
foo.c:1:5: note: shadowed declaration is here
Format strings
This doesn't require any extra flags in gcc, but it has still been the source of problems in the past. A simple function that tries to print data but has a formatting error could look like this:
#include <stdio.h>

void foo(const char *str)
{
    printf("str = %d\n", str);
}
This doesn't print the string, since the formatting flag is wrong, and gcc will happily tell you that this is probably not what you wanted:
foo.c: In function ‘foo’:
foo.c:10:12: warning: format ‘%d’ expects argument of type ‘int’, but argument 2 has type ‘const char *’ [-Wformat=]
These are just three of the many things the compiler can double-check for you. There are a lot of others, like the use of an uninitialized variable, that others have pointed out.
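As a small sketch of that last point (the function is hypothetical, and gcc typically needs optimization enabled, e.g. -Wall -O2, to track the control flow):
int bar(int flag)
{
    int x;          /* not initialized on every path */
    if (flag)
        x = 1;
    return x;       /* warning: ‘x’ may be used uninitialized [-Wmaybe-uninitialized] */
}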
– Dom
This is a specific answer to C, and why this is far more important to C than to anything else.
#include <stdio.h>

int main()
{
    FILE *fp = "some string";
}
This code compiles with a warning. What are and should be errors in just about every other language on the planet (barring assembly language) are warnings in C. Warnings in C are almost always errors in disguise. Warnings should be fixed, not suppressed.
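For what it's worth, a recent gcc reports something along these lines for the snippet above (exact wording and location vary by compiler and version):
warning: initialization of ‘FILE *’ from incompatible pointer type ‘char *’ [-Wincompatible-pointer-types]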
With gcc, we do this as gcc -Wall -Werror.
This was also the reason for all the ranting about some of the MS non-secure-API warnings: most people programming C have learned the hard way to treat warnings as errors, and then this stuff appeared that just wasn't the same kind of thing and demanded non-portable fixes.
– Joshua
Take it easy: you don't have to, it is not necessary. -Wall and -Werror was designed by code-refactoring maniacs for themselves. But other people who hate it when others break their code also find it useful.
So, this is just an optional feature, which is totally up to your preference to use or not.
I use it all the time because it helps to fix my mistakes.
– sqr163
10
Although it is not mandatory, it is highly recommended to use them.
– Spikatrix
21 hours ago
2
-Wall and -Werror was designed by code-refactoring maniacs for themselves. [citation needed]
– YSC
10 hours ago
2
It seems like you're contradicting yourself. If you "use it all the time because it helps to fix [your] mistakes," isn't it worth teaching to newer programmers so that they'll be doing it everywhere from the get-go? I don't think this question is asking whether or not it's possible to compile without -Wall and -Werror, it's just asking if it's a good idea. Which, from your last sentence, it sounds like you're saying it is.
– scohe001
5 hours ago
7
these questions and their answers are so canonical that they should be on the first page (in bold or otherwise emphasized) of any decent learning book!
– engf-010
yesterday
2
I suspect the people who tell you this have never spent an hour or more looking for the one actual error in pages of meaningless warnings.
– jamesqf
22 hours ago
55
@jamesqf The point is to fix the warnings, not to just have loads of them sitting around cluttering up the output.
– David Z
19 hours ago
11
@jamesqf If you have hundreds of thousands of lines of code that have been developed over years and never compiled with warnings enabled, then yes, turning on warnings at that point is either useless, or mandates a huge catch-up effort to fix them all (no matter what) in order to make them useful again. But that's almost completely beside the point of this question, which is to encourage people to enable warnings (and take them seriously) from the beginning, so that they don't get into horrible binds like that.
– Steve Summit
13 hours ago
4
@jamesqf And what you should do in that case is so-called "maintenance" of the code, to keep it up to current specs. ;) Obviously there are codebases where this isn't thought to be economical.
– Nobody
9 hours ago