That being said, for the most basic concepts the notation is pretty consistent, so if you skim through one or two books you'll get a feel for it. Understanding the actual math will take longer.
As for references, there is a very comprehensive standard, ISO 80000-2, which gives recommendations for many of the math symbols and notes common variations: https://people.engr.ncsu.edu/jwilson/files/mathsigns.pdf#pag...
For something shorter (and less complete), you can also check the notation appendices in my books: https://minireference.com/static/excerpts/noBSguide_v5_previ... https://minireference.com/static/excerpts/noBSguide2LA_previ...
- Mathematics for Computer Science: https://courses.csail.mit.edu/6.042/spring17/mcs.pdf
- Calculus Made Easy: http://calculusmadeeasy.org
Not directly related to your question but useful for interviews and programming puzzles nonetheless:
- Algorithms and Data Structures, The Basic Toolbox: https://people.mpi-inf.mpg.de/~mehlhorn/ftp/Mehlhorn-Sanders...
- Basic Proof Techniques: https://www.cse.wustl.edu/~cytron/547Pages/f14/IntroToProofs...
https://en.wikipedia.org/wiki/Logical_connective#Common_logi...
https://en.wikipedia.org/wiki/Quantifier_(logic)#Notation
https://en.wikipedia.org/wiki/Set_theory#Basic_concepts_and_...
https://en.wikipedia.org/wiki/Summation#Capital-sigma_notati...
https://en.wikipedia.org/wiki/Modular_arithmetic#Congruence
If Wikipedia is too hard to follow, you can learn this material from the early chapters of a discrete mathematics textbook. The sketch below shows how a few of these notations translate into code.
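A rough sketch of how some of that notation maps to code (Python used purely for illustration; the specific examples are mine, not taken from the linked articles):

    # Capital-sigma notation: sum of i^2 for i from 1 to n
    n = 10
    total = sum(i**2 for i in range(1, n + 1))

    # Quantifiers: "for all x in S" / "there exists x in S"
    S = {2, 4, 6, 8}
    all_even = all(x % 2 == 0 for x in S)   # "for all x in S, x is even"
    any_big = any(x > 5 for x in S)         # "there exists x in S with x > 5"

    # Set theory: membership, union, intersection
    A, B = {1, 2, 3}, {3, 4}
    in_A = 3 in A       # 3 is an element of A
    union = A | B       # A union B
    inter = A & B       # A intersect B

    # Modular arithmetic: a congruent to b (mod m) means m divides (a - b)
    a, b, m = 17, 5, 12
    congruent = (a - b) % m == 0

    print(total, all_even, any_big, in_A, union, inter, congruent)

Reading the capital sigma as a loop and the quantifiers as all()/any() covers a surprising amount of discrete-math notation.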
"Mathematical Notation: A Guide for Engineers and Scientists"
https://www.amazon.com/Mathematical-Notation-Guide-Engineers...
Sure, one can perhaps expect that something that uses an integral sign shares some properties with ordinary integration of real functions, but to really understand what the notation entails, one has to study the underlying material.
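For example, the same sign appears across several theories, each with its own definition (a few illustrative cases of my own, written in LaTeX):

    \int_0^1 f(x)\,dx       % Riemann integral of a real function
    \oint_\gamma f(z)\,dz   % contour integral in complex analysis
    \int_\Omega X\,dP       % Lebesgue integral / expectation in probability

Each is read differently despite the shared symbol, and each takes its own chapter of theory to understand.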
I feel that what you're asking for is akin to wanting to read a novel in a foreign language using only a dictionary of the 10% most commonly used words of that language, with each entry giving only one meaning of the word.
Two excellent resources are:
1. Introduction to Mathematical Thinking (if you prefer MOOCs) - https://www.coursera.org/learn/mathematical-thinking?
2. How to Think Like a Mathematician - https://www.amazon.co.uk/How-Think-Like-Mathematician-Underg...
I think the single biggest advantage one can have as a programmer (at least for work that does something "non-trivial", a loaded term, I know) is a firm grasp of the mathematical basis of that work. It's so much easier to start something new when you can derive it yourself.
If you have the time, I recommend "Advanced Engineering Mathematics" to bridge the gap between calculus and applications, and to cover other topics like linear algebra, analysis, and graph theory.
If you just want a mapping of symbols to words, try the LaTeX documentation, à la https://oeis.org/wiki/List_of_LaTeX_mathematical_symbols
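As a quick taste, here is a minimal, self-contained LaTeX sketch (the particular formulas are just examples I picked):

    \documentclass{article}
    \usepackage{amsmath, amssymb}
    \begin{document}
    % Quantifiers and set membership
    $\forall x \in A,\ \exists y \in B : x \le y$

    % Capital-sigma summation
    $\sum_{i=1}^{n} i^2 = \frac{n(n+1)(2n+1)}{6}$

    % Congruence in modular arithmetic
    $a \equiv b \pmod{n}$
    \end{document}

Searching that list for the command name (\forall, \sum, \equiv, ...) is often the fastest way to put a word to an unfamiliar symbol.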
Previous HN discussion / it was on the front page earlier this week:
Source: I am a mathematician
In more specialized areas like type theory, first-order logic, predicate calculus, temporal logic, etc., you have to pick up the notation as you go.
(if you're typesetting math, it's invaluable, not just decent)