"Broadly speaking, post-feminism refers to the idea that modern (‘Western’) society has now achieved for women the goals sought by previous feminist movements — or at least the most ‘acceptable’ and less radical goals, anyway."
Although people in the 1970s didn't speak of post-feminism, I remember hearing these same arguments as a boy back then.
My mother would say something like, "Of course I want equal pay and treatment, and I think I pretty much get them. That doesn't mean I have to agree with feminism."
She said this as a divorced woman struggling to hold down two part-time jobs and finish the chemistry degree she never got because she married my dad.
She was struggling, of course, because of disadvantages society imposed on her as a woman.
When she did get her degree and eventually ended up with a well-paying job, she owed that to the very feminism she disparaged.
Plus ça change ...