Why you should want to work for us, and not them.
According to Harvard Business Review, the term "employer brand" has been in use since the mid-1990s. It denotes an organization's reputation as an employer, as distinct from its broader corporate brand reputation.