What If Your Struggles Actually Make Sense? ADHD & Autism

There is a particular kind of unease that arises when we encounter statistics without context: numbers that seem to climb too quickly, headlines that frame difference as danger, language that turns lived experience into something urgent and alarming. In recent years, autism has increasingly been placed within this narrative. The rising rates of diagnosis are often interpreted as evidence of something going wrong, as though we are witnessing the spread of a condition rather than the unfolding of understanding. But this interpretation rests on an assumption that deserves to be questioned. What if the numbers are not telling us that autism is increasing, but that our capacity to recognize it is finally catching up to reality? And perhaps more unsettling still, what if the deeper issue is not the rise in autism, but the narrowness of what we have historically defined as normal?

Autism is not new. Recognition is. The traits we now describe as autistic (deep focus, heightened sensory awareness, pattern recognition, differences in social communication) have existed across human history, long before they were named, categorized, or pathologized. Some researchers have even suggested that these traits may have played meaningful roles in early human survival and expression, pointing to the intricate, layered detail of Paleolithic cave art as a possible reflection of autistic perceptual styles (Humphrey, 1998; Spikins et al., 2018). Whether or not these interpretations can be definitively proven, they gesture toward something important: neurodivergent ways of thinking are not recent anomalies. They are part of the human story.

What has changed is not the presence of autism, but the frameworks through which we understand it. For much of the twentieth century, autism was narrowly defined, often conflated with schizophrenia, and primarily identified in young white boys who displayed overt and externalized traits. Entire groups of people were left out of this picture: girls and women whose differences were more internalized, individuals from racialized communities who faced barriers to access and referral, adults who had learned to mask and compensate, and those whose autistic traits were obscured by co-occurring conditions such as ADHD, anxiety, or trauma. These individuals were not less autistic. They were simply less visible to a system that did not yet know how to see them.

Over time, diagnostic criteria expanded. Autism was formally recognized as its own diagnostic category in 1980 with the DSM-III, broadened in the 1990s with the DSM-IV to include a wider range of presentations, and eventually reframed as a single spectrum in 2013 with the DSM-5. Screening practices improved. Awareness increased. Schools and healthcare systems began to require formal diagnoses for access to support. The result was a steady rise in identification. But this rise did not reflect the sudden emergence of autism; it reflected the gradual dismantling of a narrow lens. When more people are included in the definition, more people are recognized. What looks like an increase in prevalence is, in many ways, an increase in visibility.

At the same time, something else was happening outside of clinical settings. The internet created spaces where neurodivergent individuals could describe their experiences in their own words, without being filtered through diagnostic language. For many, this was the first time they encountered narratives that resonated with their internal world. It was not a checklist that led to recognition, but a sense of being seen. Posts about sensory overwhelm, about scripting conversations, about the exhaustion of masking and the confusion of always feeling out of step: these became mirrors. And in those mirrors, people began to recognize themselves. This collective self-recognition has contributed to the rise in adult diagnoses, but it has also done something more profound. It has shifted autism from a purely clinical category into a lived, shared, and culturally articulated experience.

What begins to emerge, then, is not simply a story about increasing recognition, but the edges of a deeper and more unsettling question, one that sits just beneath the surface of the data, rarely asked directly, yet quietly shaping the entire conversation. As diagnosis rates continue to rise, with more adults identified across the lifespan and more individuals in their forties, fifties, and beyond finally receiving language for experiences that once felt inexplicable, the dominant narrative has been one of improved detection. Better criteria. Greater awareness. Reduced stigma. And all of that is, in many ways, true. But there is another interpretation running underneath it, one that the medical and scientific communities have historically been less comfortable examining. What if we’re not the deviation? What if neurodivergence is not a divergence from the human baseline at all, but a reflection of it?

To consider this, we have to step outside the immediacy of modern life and look at the broader arc of human history. Anatomically modern humans have existed for approximately 300,000 years. The agricultural revolution, when humans began to settle, structure, and standardize life, emerged roughly 12,000 years ago. The industrial revolution, which introduced repetitive labour and mechanized productivity, is only about 250 years old. And the environments that now define daily functioning (office work, formal schooling, prolonged sedentary attention to low-stimulation tasks) are, at most, 150 years old.

Against that timeline, the traits we categorize as neurodivergent begin to look different. What we call ADHD includes constant environmental scanning, rapid pattern recognition across multiple inputs, sensitivity to social and threat cues, and the ability to hyperfocus in high-stimulus contexts. In a dynamic, unpredictable environment, these are not impairments; they are adaptive responses. What we call autism includes deep systematic thinking, heightened sensory acuity, exceptional attention to detail, and resistance to arbitrary or illogical social rules. In an environment without artificial lighting, constant noise, or rigid social hierarchies, these traits are not overwhelming; they are attuned. Dyslexia brings visual-spatial reasoning, the ability to process wholes before parts, and strengths in navigating complex, three-dimensional environments. Dyspraxia and proprioceptive sensitivity reflect a nervous system deeply connected to movement, terrain, and spatial awareness, one that continuously calibrates in response to the physical world.

Within a pre-industrial context, these are not isolated deficits. They form a cognitive ecosystem: a distributed range of perceptual and processing styles that, collectively, allow a group to respond more effectively to a complex and changing environment. In contrast, the traits we often consider “neurotypical” (comfort with routine, sustained attention to repetitive tasks, tolerance for low-stimulation environments, and adherence to structured systems) align closely with the demands of modern industrial and post-industrial life. These traits function well within systems that prioritize consistency, predictability, and control. But those systems are new.

This raises a question that is less comfortable, but increasingly difficult to ignore. If the environments we have built are narrow, and the range of human cognition is broad, then what we are witnessing may not be the emergence of disorder, but the exposure of mismatch. The challenges so many neurodivergent individuals experience are real and often profound, but they do not exist in a vacuum. They arise in the space between a nervous system and an environment that was never designed to hold it.

From this perspective, the rising rates of diagnosis begin to take on a different meaning. They are not simply evidence that more people are becoming autistic. They may be evidence that more people are struggling within systems that accommodate only a limited range of cognitive styles, and that more people are finding language to describe that struggle. They reflect a growing awareness that what was once interpreted as personal failure may, in fact, be structural misalignment.

There is also a quieter, often overlooked factor contributing to this shift: continuity across generations. Autism is heritable, not through a single gene, but through constellations of traits that run in families. As more children are identified, parents often begin to recognize similar patterns in themselves. What once appeared as isolated cases begins to reveal itself as a lineage. This is not the spread of a condition, but the recognition of something that has always been there, moving through generations without a name.

When we bring all of these pieces together (the expansion of diagnostic criteria, the increase in screening, the recognition of previously overlooked groups, the cultural impact of self-identification, the possibility of environmental mismatch, and the intergenerational nature of neurodivergence), the narrative of an “autism epidemic” becomes difficult to sustain. The language of epidemic implies contagion, crisis, something spreading uncontrollably. But autism does not spread in this way. What is spreading is awareness. What is increasing is recognition. What is shifting is our understanding of human variation.

This does not mean that the challenges associated with autism are insignificant or that support is unnecessary. Many autistic individuals experience real and meaningful difficulties, particularly in environments that are not designed with their needs in mind. But framing autism as a crisis to be solved directs attention toward eliminating difference rather than understanding it. It locates the problem within the individual rather than within the systems that fail to accommodate diversity.

If we move away from the language of crisis, a different question begins to emerge. Not how do we stop autism, but how do we create environments that can hold a wider range of human minds? How do we design systems that are flexible enough to support different ways of processing, communicating, and relating? How do we shift from a model that demands conformity to one that values variation?

When we return to the numbers (one in 150, then one in 54, now one in 36), they no longer read as an alarm. They read as a story. A story of people who were once invisible becoming visible. A story of science revising itself in the face of complexity. A story of individuals finding language for experiences they have carried for a lifetime. And perhaps most importantly, a story of recognition, not just by clinicians, but by people recognizing themselves and each other.

The numbers do not signal a breakdown. They mark a turning point.
