1. Essentially...

2. Fluids

3. Situations

4. Social proof

5. Continuum

6. Gender for sale

7. Refinement

8. Resistance

Marketers and predictive-analytics firms collect enough information about us to claim to know our gender better than we do ourselves.
Within data sets, people's gender can be deduced from other information collected about them, say, the online sites they visit, what sorts of goods they purchase, the shape of their social network, the frequency of their interactions with various platforms, and so on. The gender a predictive algorithm deduces for a particular individual may not match what that individual believes her or his gender to be. In a world increasingly governed by algorithms, the individual will be wrong and the algorithm right.
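To make the mechanism concrete, consider a minimal sketch in Python of the kind of inference being described: a few behavioral signals are weighted, summed, and squashed into a probability, and a threshold, not the person, issues the label. Every feature name and coefficient below is invented for illustration; this reproduces no actual platform’s model.

```python
import math

# Hypothetical behavioral signals for one user (normalized to 0-1).
user = {
    "visits_sports_sites": 0.2,
    "visits_beauty_sites": 0.9,
    "buys_power_tools": 0.1,
    "social_graph_density": 0.7,
}

# Invented coefficients standing in for what a marketer's model might
# have learned from labeled purchase records.
weights = {
    "visits_sports_sites": 1.4,
    "visits_beauty_sites": -2.1,
    "buys_power_tools": 1.8,
    "social_graph_density": -0.3,
}
bias = 0.1

score = bias + sum(weights[k] * user[k] for k in user)
p_male = 1 / (1 + math.exp(-score))  # logistic squash into [0, 1]

# The system's verdict, issued regardless of what the user would say.
algorithmic_gender = "male" if p_male > 0.5 else "female"
print(algorithmic_gender, round(p_male, 2))  # -> female 0.18
```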
As John Cheney-Lippold claims in “A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control,” “how a variable like X comes to be defined is not the result of objective fact but is rather a technologically-mediated and culturally-situated consequence of statistics and computer science.” This means gender, at least within big data sets (and what isn't within a big data set these days?), is as fluid as the algorithm that defines it within the database.
The way gender is defined algorithmically can vary from moment to moment and from situation to situation, and it can vary from individual to individual within the system. You may be male one minute, not male the next, male again tomorrow, and so on. You might be female to a shopping site in one tab but male to a social-media platform in another. Cheney-Lippold argues that in data sets, “gender becomes a vector, a completely digital and math-based association that defines the meaning of maleness, femaleness, or whatever other gender (or category) a marketer requires.” So the gender you are assigned within a particular system may depend on what that system wants to sell you.
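Cheney-Lippold’s phrase can be read almost literally. Each platform holds its own weight vector, its own working definition of “male,” and the same behavioral record can land on either side of the boundary depending on whose vector does the scoring. A sketch, continuing the invented numbers from above:

```python
# The same behavioral record, scored under two platforms' different
# definitions of "male." Both weight vectors are invented; the point is
# that the label belongs to the classifier, not the person.
user = [0.2, 0.9, 0.1, 0.7]  # the same four features as above

shopping_site = [1.4, -2.1, 1.8, -0.3]    # one hypothetical definition
social_platform = [-0.5, 0.4, 0.2, 1.1]   # another, equally arbitrary

def label(weights, features):
    score = sum(w * f for w, f in zip(weights, features))
    return "male" if score > 0 else "not male"

print(label(shopping_site, user))    # -> not male
print(label(social_platform, user))  # -> male
```

Nothing about the user changes between the two calls; only the definition does.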
Algorithms can be seen to hold the secret truth of gender. Feminist scholars have long held that gender norms are constructed politically and then naturalized with reference to biological sex. “Maleness” and “femaleness” are contested sites with high political stakes, as Joan Scott argues. Algorithms enter that contested field and offer a positivistic solution, using data to rationalize and “prove” which gendered stereotypes should be regarded as “true.”
Then we may, for instance, strive to perfect our “femaleness” according to what a platform’s copious data “proves” it should be. To pursue that perfection, we will consent to continually interact with that system, give it more information, submit to its tests. The more we think we know what our gender “should” be, the more we may obey the system that purports to be able to confirm it.
Algorithms can make gender quantitatively comparative within a system, with different individuals achieving different percentages of “absolute maleness” or “absolute femaleness” according to how the system defines those traits. The definitions would likely be determined by patterns in the data, inflected by whatever biases or self-interested aims the algorithms’ programmers encode within them. Marketers could use these kinds of calculations to play on consumers’ internalized prejudices about and overinvestment in their self-identified gender, spelling out the behavior that can make men more manly, and so on.
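What “75% male” might mean in practice is nothing deeper than a squashed score. A sketch, with all raw numbers fabricated:

```python
import math

def pct_male(raw_score):
    """Map a raw model score onto a 0-100% 'maleness' scale."""
    return round(100 / (1 + math.exp(-raw_score)))

# Fabricated raw scores for three users.
raw_scores = {"user_a": 2.3, "user_b": -0.4, "user_c": 1.1}

# Gender as a leaderboard: individuals ranked against one another.
for uid, s in sorted(raw_scores.items(), key=lambda kv: -kv[1]):
    print(f"{uid}: {pct_male(s)}% male")
# user_a: 91% male
# user_c: 75% male
# user_b: 40% male
```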
Gender stereotypes derived algorithmically as seemingly objective truths can then be deployed tactically within platforms (Facebook, for example) that mirror the self back to users, to trouble their sense of identity in any number of ways. This may be added to a platform’s suite of services for its advertisers (e.g., destabilize the population of users who score as “75% male” with loops of material that tends to make users’ behavior register as even more “female” to the algorithm). The actual fluidity of gendered behavior, from the point of view of the algorithm, can be played off the rigidity of self-assigned gender identifications for commercial purposes.
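The parenthetical “service” above might amount to little more than a filter over those percentages plus a content assignment, as in this deliberately crude sketch:

```python
# Fabricated percentages, continuing the example above.
users = {"user_a": 91, "user_b": 40, "user_c": 75}

low, high = 70, 80  # target the cohort scoring roughly "75% male"
cohort = [uid for uid, pct in users.items() if low <= pct <= high]

for uid in cohort:
    # Queue material the model predicts will shift the user's score.
    print(f"serve score-shifting content loop to {uid}")  # -> user_c
```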
Data platforms need not ascribe gender as an either-or but can instead locate it within a matrix, as an intersecting point on many different continuums. It can be endlessly nuanced with subclassifications, each with its own distinct set of data patterns. If we let algorithms deduce our gender, we may become open to letting gender be something other than binary.
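A sketch of that matrix, assuming a marketer-defined set of segments: each user is a point on several continuums at once, and a “subclassification” is simply the named centroid the point falls nearest to. All coordinates and segment names here are invented.

```python
# Each "subclassification" is a named centroid in a space whose axes
# might be read as maleness, femaleness, domesticity, risk-taking, etc.
subclasses = {
    "segment_17": (0.8, 0.3, 0.1, 0.9),
    "segment_42": (0.4, 0.6, 0.7, 0.2),
    "segment_98": (0.1, 0.9, 0.5, 0.5),
}

def nearest(point):
    """Assign a user-point to its closest segment (squared distance)."""
    return min(
        subclasses,
        key=lambda name: sum(
            (a - b) ** 2 for a, b in zip(subclasses[name], point)
        ),
    )

user_point = (0.5, 0.5, 0.6, 0.3)  # one user's position on the continuums
print(nearest(user_point))  # -> segment_42
```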
When the way we are targeted commercially within a consumerist society no longer relies on gender as such but on more refined clusters of behaviors, gender may lose some of its relevance as a social category. But gender’s long history of structuring political conflicts and power differentials makes this unlikely. Instead, algorithms may be used to simultaneously destabilize gender norms and establish their supposed objectivity, their inevitable “reality” in the social world. Efforts to conform to gender norms can thereby become a never-ending project, demanding a discipline useful to social-control efforts, as well as yielding labor that can be commercially harvested in various ways.
To recap: The full range of our identity markers can be algorithmically reconstituted depending on our context; our imputed identity, even in terms of the familiar and seemingly fixed categories (gender, class, race, religious affiliation, etc.), can change from site to site, depending on the data set each draws from.
Our data is reprocessed from moment to moment, positing a different self for us to inhabit and imposing a different set of prejudices on us. Trying to wrest control of these from within the system only refines the data by which the process is implemented. As Cheney-Lippold argues, “We are effectively losing control in defining who we are online, or more specifically we are losing ownership over the meaning of the categories that constitute our identities.” But we never had ownership of these social facts in the first place.
The more we insist we know ourselves independent of the social contexts in which we operate, the more algorithms can manipulate us, promising us a set of tools for reshaping how we are seen into what we think people should see. Recasting the self as data offers the illusion of control over that data, but it also endorses the way algorithms parse it, a process even more opaque than the ways other people interpret and categorize us.
An alternative is to embrace the way identities are algorithmically imposed on us and to consume our “selves” as perpetually novel, ultra-personalized consumer goods. One can accept the ready pleasure of consumerism rather than pursue the freedom of autonomy, which is always imperfect and requires boundless innovation in our techniques of resistance.