This is the founding premise of the science fiction trilogy The Three-Body Problem by Liu Cixin (adapted from philosophy.stackexchange.com):

It starts with two axioms:

Survival is the most important goal of every civilization.
Every civilization will continue to expand and grow, but resources in the universe are finite.

and with two assumptions:

Suspicion Chain
Technology Explosion

Let's start with a thought experiment. Suppose civilization A discovers civilization B.

Civilization A has two primary choices:

Do nothing.
Make contact in some way.

To simplify the problem, we categorize civilizations in the universe into two kinds:

Hostile and Friendly

Hostile civilizations will attack another civilization as soon as it is discovered. Friendly civilizations will only attack when threatened.

Now here comes the suspicion chain:

Assume A is friendly. Even so, A has no way of knowing whether B is friendly, and vice versa.
Even if A knows B is friendly and B knows A is friendly, how does A know that B thinks A is friendly, and vice versa?

And if A knows that B thinks A is friendly, how does A know that B knows that A knows that B thinks A is friendly, and vice versa?

This is an endless cycle. On Earth we can eliminate suspicion by communicating, but in space communication is limited by the speed of light, so while you are still communicating you cannot be sure an attack is not already under way. It is therefore optimal not to make contact. To be pedantic: it is optimal for A not to make contact when B is more technologically advanced than A, because the suspicion chain leaves A no way of knowing B's true intentions. If A is the more technologically advanced one, there is no risk.
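To make the regress concrete, here is a minimal sketch (my own illustration, not part of the original argument) that prints the statement left unverified after each round of confirmation. No finite number of exchanges ever closes the chain; eliminating suspicion would require common knowledge, an infinite tower of "knows that".

```python
# Hypothetical illustration of the suspicion chain: each round of communication
# adds one more level of "knows that", but the next level up is always unverified.

def suspicion_statement(depth: int) -> str:
    """Build the nested-knowledge statement that remains unverified at a given depth."""
    statement = "B is friendly"
    for level in range(depth):
        knower = "A" if level % 2 == 0 else "B"
        statement = f"{knower} knows that {statement}"
    return statement

for rounds in range(4):
    print(f"after {rounds} exchange(s), still unverified: {suspicion_statement(rounds + 1)}")
```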

However, another issue arises here, going back to our second assumption. If A does nothing, or proceeds to make contact, there is a chance that a technology explosion occurs in B's civilization and B surpasses A technologically. Since A has no way of knowing B's true intentions, to guarantee the first axiom A has only one optimal move: destroy any civilization as soon as it is discovered.
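The same argument can be written as a simple expected-loss comparison. The numbers below are hypothetical placeholders, not values from the text; the point is only that once the loss from being surpassed by a hostile B is effectively unbounded, "destroy on discovery" beats "wait" for any nonzero probabilities of a technology explosion and of hostility.

```python
# A hedged sketch of A's decision; p_tech_explosion, p_hostile and the losses
# are made-up parameters, only the structure of the comparison matters.

def expected_loss_if_waiting(p_tech_explosion: float,
                             p_hostile: float,
                             loss_if_destroyed: float) -> float:
    """Expected loss to A from leaving B alone: B may undergo a technology
    explosion, surpass A, and turn out to be hostile."""
    return p_tech_explosion * p_hostile * loss_if_destroyed

loss_if_destroying_now = 0.0  # in this crude model, striking first costs A nothing

# Even tiny probabilities make waiting worse once the stakes are survival itself.
for loss in (1e3, 1e6, 1e12):
    waiting = expected_loss_if_waiting(p_tech_explosion=0.01, p_hostile=0.01,
                                       loss_if_destroyed=loss)
    print(f"loss if destroyed = {loss:.0e}: expected loss waiting = {waiting:.2f}, "
          f"destroying now = {loss_if_destroying_now:.2f}")
```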

This is logical:

Consider the following payoff matrix in game theory:

B \ A                  Contact (Destroy)     Not Contact
Contact (Destroy)      (0, 0)                (0, -infinity)
Not Contact            (-infinity, 0)        (-K, -L)

where each entry lists (B's payoff, A's payoff), K is the probability that A surpasses B and turns out to be hostile, and L is the probability that B surpasses A and turns out to be hostile.

We can see that the only pure-strategy Nash equilibrium is (Destroy, Destroy): against an opponent who destroys, not contacting means annihilation, and mutual restraint is not stable because either side gains by switching to destroy.
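As a sanity check, here is a small sketch that enumerates the pure strategy profiles of the matrix above and tests each one for the Nash property (no player can gain by deviating unilaterally). A large finite penalty stands in for -infinity, and K, L are arbitrary positive placeholders.

```python
# Payoffs are (B's payoff, A's payoff), as in the matrix above.
# A large finite penalty approximates -infinity; K and L are placeholder values.
INF = 1e9
K, L = 0.5, 0.5

STRATEGIES = ("Destroy", "NotContact")
PAYOFFS = {
    ("Destroy", "Destroy"):       (0.0, 0.0),
    ("Destroy", "NotContact"):    (0.0, -INF),
    ("NotContact", "Destroy"):    (-INF, 0.0),
    ("NotContact", "NotContact"): (-K, -L),
}

def is_nash(b_move: str, a_move: str) -> bool:
    """A profile is a Nash equilibrium if neither player can do better by deviating alone."""
    b_payoff, a_payoff = PAYOFFS[(b_move, a_move)]
    best_b = max(PAYOFFS[(alt, a_move)][0] for alt in STRATEGIES)
    best_a = max(PAYOFFS[(b_move, alt)][1] for alt in STRATEGIES)
    return b_payoff >= best_b and a_payoff >= best_a

for b in STRATEGIES:
    for a in STRATEGIES:
        if is_nash(b, a):
            print(f"Nash equilibrium: B plays {b}, A plays {a}")  # only (Destroy, Destroy) prints
```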

Note: Obviously this is a crude method of estimating outcomes, but we can deduce from natural selection that there are probably only three kinds of surviving civilizations: those not yet detected, those very good at hiding themselves, and those that always destroy.

This conclusion gives rise to the Dark Forest Postulates:

The universe is a dark forest. Every civilization is a hunter with a gun. They pass quietly through the forest like ghosts. They must be extremely cautious and do their best to keep silent, because they know there are any number of hunters out there. And if a hunter discovers another, no matter whether it is an angel or a demon, an old or a young civilization, the only thing he can do is kill it. In this forest, the other hunters are the eternal threat. Any civilization that reveals its location will be destroyed.