On the fractal nature of complex syntax and the timescale problem

Fundamental to complex dynamic systems theory is the assumption that the recursive behavior of complex systems results in the generation of physical forms and dynamic processes that are self-similar and scale-invariant. Such fractal-like structures, and the organismic benefit that they engender, have been widely noted in physiology, biology, and medicine, yet discussions of the fractal-like nature of language have remained at the level of metaphor in applied linguistics. Motivated by the lack of empirical evidence supporting this assumption, the present study examines the extent to which the use and development of complex syntax in a learner of English as a second language demonstrate the characteristics of self-similarity and scale invariance at nested timescales. Findings suggest that the use and development of syntactic complexity are governed by fractal scaling, as the dynamic relationships among the subconstructs of syntax maintain their complexity and variability across multiple temporal scales. Overall, fractal analysis appears to be a fruitful analytic tool when attempting to discern the dynamic relationships among the multiple component parts of complex systems as they interact over time.


Introduction
It is no secret that recursive, self-similar patterns abound in the world that surrounds us. From the biological self-similarity evidenced, for example, in the vascular branching of our internal organs, to the self-similar patterning of abiotic structures including mountain ranges, river systems, clouds, and coastlines, the natural world flourishes in recursive iteration. Such self-similarity, as we learn from fields such as physics and mathematics, results from the behavior of complex dynamic systems as they iterate over time (Brown, Gupta, Li, Milne, & West, 2002). The feedback-sensitive nature of such systems, in which the current state of the system serves as the input for each subsequent state (Thelen, 2005), engenders the self-similar patterns so frequently witnessed in nature. To be sure, recognition of the self-similar patterning so common in the natural world is not new; however, it was not until Benoit Mandelbrot (1982) advanced the notion of fractal geometry that a language emerged capable of examining and describing the traces left by the recursive iteration of complex systems (Peitgen, Jürgens, & Saupe, 1992). Across both time and space, complex systems in the natural world commonly generate fractal-like structures, thus upholding their intrinsic complexity at smaller and smaller temporal and physical scales.
As Mandelbrot articulated, the term fractal refers to a class of geometric figures that, when magnified, reveal increasingly smaller, yet self-similar structures on every scale. Fractal scaling is pervasive in nature and has been noted in a myriad of physical and biological processes. Language, of course, has not escaped these discussions, as scholars have alluded to the fractal nature of language for some time (Lyle, 2009; Shanon, 1993; Youmans & Pfeifer, 2005). This discussion, however, has been limited, relying almost exclusively on the fractal as a metaphor to describe and explain language, as the requisite characteristics of fractals bear a distinct resemblance to many of the defining features of human language (Pareyon, 2007). Clear reference has been made, for instance, to the oft-cited and widely recognized recursive nature of language. In this way, the iterative combinatorial process of language production results in a branching of structures that, when examined independently, frequently resemble the whole. Fruitful as the fractal metaphor may be, the extent to which fractal patterns appear in language use is still uncertain (Larsen-Freeman & Cameron, 2008). Given this uncertainty, if language is to be accepted as fractal-like, there is a clear need for empirical evidence supporting the self-similar, scale-free nature of language use and development. Furthermore, as advances in disciplines such as biology and physiology have uncovered the pervasiveness of fractal scaling in living organisms (Brown et al., 2002), our understanding of the organismic benefit derived from fractal scaling has continued to grow. If language use and development can be demonstrated to exhibit fractal patterning, logically, the biological benefits of fractality may equally be shared by the human language capacity.
Given the pervasiveness of self-similarity in the physical world, the present paper empirically explores the fractal nature of language, focusing specifically on the temporal self-similarity of complex syntax. To accomplish this task, I draw on longitudinal oral production data of a second language (L2) learner of English to show that complex syntax evidences the defining characteristics of fractal structure: self-similarity and scale invariance, along with a nonlinear distribution of form. In doing so, this study adds to the previous body of literature discussing the nonlinear, Zipfian distributions of lexis (Larsen-Freeman, 1997; Larsen-Freeman & Cameron, 2008) and lexical variation (Kretzschmar, 2015) by arguing that complex syntax, too, appears to be governed by the principles of scale-free, fractal fluctuation. As arguments for the biological benefit of fractal scaling in human physiology have gained traction in recent years (West, 2006), I advance this conjecture vis-à-vis language use and development in a discussion of the potential benefits of the fractal nature of complex syntax. Finally, recent complexity scholarship has affirmed that the components and subcomponents of the complex linguistic system develop at highly differential rates over time (de Bot, 2015), yet the true timescales of development remain elusive (Ruhland & van Geert, 1998). It follows, then, that this imperfectly understood area of language development hints at a broader issue within the field of applied linguistics, one which I term the timescale problem. Given that research into fractal scaling in language is in its infancy, I posit the latent potential of fractal analysis to address the timescale problem and our impoverished understanding of the unique timescales of language development.
Adopting a fractal analytical framework, I will argue, accommodates a view in which language is seen not as an artifact of a single arbitrary timescale, but rather as a dynamic process situated within and across multiple scales of time.

Complex dynamic systems theory (CDST)
Since its introduction into the field of applied linguistics (Larsen-Freeman, 1997), CDST has gained increasing favor. First adopted as a productive metaphor to describe and explain language development and use (Larsen-Freeman & Cameron, 2008), CDST has since been applied empirically, and complexity-informed scholarship has demonstrated that language exhibits many of the fundamental properties of complex systems. Relating both to the structural and behavioral characteristics of such systems, we now understand language to consist of a multitude of interacting components and subcomponents (van Geert, 2008) whose open, feedback-sensitive nature leads to seemingly random, nonlinear trajectories of development over time (e.g., Dong, 2016; Schmid, Kopke, & de Bot, 2013).
Complex systems are composed of a theoretically limitless set of interrelated component parts, which may be either endogenous or exogenous to the system itself (de Bot & Larsen-Freeman, 2011). Few would dispute the applicability of this notion to the language faculty, as both system-internal and system-external factors are recognized to affect language development and use. From within, psychological constructs and language skills such as mental grammars, phonology, and the lexicon all interface with the external linguistic environment through distributed social interaction (Lowie & Verspoor, 2015). As individuals differ significantly in terms of biological propensity as well as experience, the unique ways in which the components of the complex linguistic system interact lead to highly individualized developmental trajectories. As a result, complexity studies have highlighted such inter-individual variability (Baba & Nitta, 2014; van Dijk & van Geert, 2007), thus demonstrating that group means tell us very little about how individual language users develop in their abilities over time (Chan, Verspoor, & Vahtrick, 2015; Larsen-Freeman, 2006; Verspoor, Lowie, Chan, & Vahtrick, 2017).
Equally relevant to human language, complex systems are said to be open, or dissipative (Prigogine & Stengers, 1984), in that they find themselves in a constant exchange of energy and information with the external environment (Kauffman, 1995). Coupled with the curious interplay that results among the manifold components of the system, this continuous exchange gives rise to a complex linguistic system that is inherently mutable, yet atelic as it possesses no discernible end state (de Bot, Chan, Lowie, Plat, & Verspoor, 2012; Larsen-Freeman, 2015). External pressures are a driving force in system evolution, yet change also initiates from within. Internal pressures exerted on the system motivate the process of self-organization by which the synergistic activity of system-internal components generates novelty in behavior (Smith & Thelen, 2003), pushing the system toward higher orders of complexity and spontaneous pattern formation (van Geert, 2008). As complex systems respond and adapt to internal and external pressures, variability is common as systems evolve over time (van Dijk & van Geert, 2011). This notion has been demonstrated in both first language (L1; Bassano & van Geert, 2007) and L2 development (Lowie, Verspoor, & van Dijk, 2018; Penris & Verspoor, 2017), leading some to suggest that variability not only promotes flexibility within the system but may truly be a prerequisite to development (Lowie, Verspoor, & van Dijk, 2018).
The feedback sensitivity of complex systems equally contributes to the variability in system development over time. Dissipative systems do not passively exchange information with their surroundings; instead, they actively gather information about themselves and their environment (Gell-Mann, 1994) with each iterative exchange serving as an attempt to further benefit the system (Larsen-Freeman, 1997). As a consequence, feedback-sensitive systems evolve via an iterative process by which the current state, or iteration, of the system serves as input for the next proximal state (Thelen, 2005). This view is highly amenable to emergentist accounts of language use and development which argue that grammatical regularities emerge from the interaction of the components both internal and external to the complex linguistic system itself (Ellis, 2011). In real time, the complex linguistic system adapts to the moment-to-moment constraints and affordances of the context in which it is used. In turn, these exchanges reinforce certain stable conditions, or attractor states (Hiver, 2015), while equally pushing the system away from others (van Geert, 2008).
Recent complexity scholarship has shaped our understanding of language as an open, feedback-sensitive, nonlinear, complex, adaptive system (Beckner et al., 2009). Significant questions remain, however, relating to how language development and use uphold additional characteristics of complex systems. We know, for instance, that the self-organization of complex systems as they iterate over time leads to the emergence of novel patterns of form and behavior (Evans & Larsen-Freeman, 2020). Fields such as mathematics and physics have demonstrated that this process of iteration frequently generates fractal-like structures, that is, those which mimic themselves on smaller and smaller scales (Feldman, 2012). Though complexity-informed scholarship in applied linguistics has hinted at the fractal nature of language (Larsen-Freeman & Cameron, 2008), the extent to which these forms exist is still uncertain.

Fractals
Fractals are those geometric forms which, when magnified, reveal increasingly smaller, yet self-similar patterns across all scales of measurement. A product of complex dynamic systems (Delignières & Marmelat, 2012), geometric fractals result from iterative mathematical equations that, when repeated, are capable of expressing an infinite amount of detail in a seemingly finite space (Chávez & Ragan, 2016). This process is demonstrated, for example, in the often-cited Sierpiński triangle. The Sierpiński triangle is created via a recursive process in which the central triangle within an equilateral triangle is removed, leaving three self-similar triangles. When this process is repeated iteratively with the remaining triangles, a fractal figure emerges (see Figure 1). The Sierpiński triangle illustrates two of the defining characteristics of geometric fractals: self-similarity and scale-invariance. When a segment of the Sierpiński triangle is magnified, the details revealed are identical, or self-similar, to those of the entire figure. As this process may be repeated endlessly with similar results, the Sierpiński triangle equally illustrates scale-invariance, as each iteration produces equal amounts of complex detail down to the most arbitrarily small scales.

Figure 1
The Sierpiński triangle

Nature makes use of the efficiency of scale-free self-similarity, as the iteration of simple rules makes the generation of fractals relatively "easy" (Feldman, 2012). Fractal structures found in nature, however, seldom display self-similarity with the same exactitude as their mathematical brethren due, in significant part, to the inherent randomness of natural processes (McGrath, 2016). While geometric fractals demonstrate limitless scale-invariance, true fractals in this sense do not occur in nature. Instead, the stochastic nature of the physical world results in fractal structures that rarely exhibit perfect self-similarity; that is, an isolated segment of an object is most often only an approximation of the whole (Peitgen et al., 1992).
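The recursive construction just described can be sketched in a few lines of code. The sketch below uses the well-known correspondence between the Sierpiński triangle and Pascal's triangle modulo 2 (cells with odd binomial coefficients form the fractal); the function names are my own illustration:

```python
import math

def sierpinski_rows(depth):
    """First 2**depth rows of Pascal's triangle mod 2; the cells with
    odd binomial coefficients trace out the Sierpinski triangle."""
    n = 2 ** depth
    return [[math.comb(r, c) % 2 == 1 for c in range(r + 1)]
            for r in range(n)]

def filled_cells(depth):
    """Count the 'surviving' cells; each iteration of the construction
    triples the count, so this grows as 3**depth."""
    return sum(sum(row) for row in sierpinski_rows(depth))

# Self-similarity in numbers: magnifying one level deeper always
# reveals three copies of the previous figure.
counts = [filled_cells(d) for d in range(1, 5)]  # [3, 9, 27, 81]
```

The tripling at every depth is the numerical trace of the three self-similar copies produced by each removal step.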

Distribution of form
As demonstrated in the Sierpiński triangle, each successive iteration contains further reduced triangles that resemble the whole. In this way, the distribution of form within fractal structures is a function of the scale of measurement -while only a limited number of large triangles can be discerned, countless smaller and smaller triangles are observed as the shape is magnified. This unique relationship, when plotted graphically, results not in a normal, or Gaussian distribution of form, but in a probability density function governed by a power-law (see Figure 2). Indicative of self-similar, fractal processes (West, 2001), power-laws describe scaling relationships in which each successively smaller structure is exponentially more frequent than that which precedes it (Feldman, 2012). In the case of the Sierpiński triangle, a large probability exists of observing a small triangle, yet the probability of observing a large triangle is exponentially diminished.
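For the Sierpiński triangle this scaling relationship can be stated exactly: the number N(s) of self-similar triangles of side length s obeys the power law N(s) = s^(-D), where D = log 3 / log 2 ≈ 1.585 is the fractal dimension. A quick numerical check (a sketch, not part of the study's analysis):

```python
import math

# Fractal dimension of the Sierpinski triangle: halving the side
# length triples the number of self-similar triangles.
D = math.log(3) / math.log(2)  # ~1.585

def triangle_count(side):
    """Number of self-similar triangles of a given side length,
    under the power law N(s) = s**(-D)."""
    return round(side ** (-D))

sides = [1 / 2 ** k for k in range(5)]       # 1, 1/2, 1/4, 1/8, 1/16
counts = [triangle_count(s) for s in sides]  # [1, 3, 9, 27, 81]
```

Each halving of the observation scale triples the count, which is exactly the exponential growth in frequency of smaller forms that the power-law distribution describes.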

Figure 2
The asymptotic, long-tail curve of a power-law distribution

The asymptotic, long-tail distribution of power-laws embodies the scale-free nature of fractals as the "average" form depends not on the traditional bell curve, but rather on the degree of resolution used in observation (Feldman, 2012). Thus, power-law probability density functions challenge the commonly held assumption in science and mathematics that measurement on a single scale results in an appropriate description of a fractal object (Liebovitch & Shehadeh, 2005). Alternatively, investigating fractal forms necessarily warrants measurement across scales as scaling relationships provide significant insight into the patterns found in nature.

Self-similar structure versus self-similar dynamics
While physical, or geometric, self-similarity is highly prevalent in nature and mathematics, fractal scaling is equally observed in the temporal dynamics, or behavior, of complex systems as they iterate over time. Whereas fractal geometry describes self-similarity in physical form, in the temporal dimension, self-similarity is an expression of the system's dynamics (West, 2001) and is suggestive of a process in which fluctuations in behavior appear identical at increasingly smaller moments in time. Certainly, all biological processes, including language, exhibit a certain degree of variability in their developmental trajectories. When examined at successively smaller time scales, patterns of variability frequently appear self-similar across temporal scales (Rhea et al., 2014). In this way, as geometric fractals maintain their complexity across physical scales, a system's dynamics do not diminish in complexity as temporal scales are magnified. Instead, the moment-to-moment fluctuations of complex systems retain their complexity as the system "dances among the time scales" (West, 2001, p. 397). It is the "dance" among time scales and the emergent patterns that it produces that is of most interest to those seeking to understand the fractal nature of behavior.
In sum, power-law scaling relationships and fractal self-similarity are recognized as a robust analytic framework for investigating the emergence and behavior of complex systems (Brown et al., 2002). As power-law probability density functions are the "signature" of fractals (Feldman, 2012), evidence of these distributions supports the fractal nature of complex biological processes. Fields such as physiology and medicine have capitalized on this understanding leading to new perspectives on health (West, 2006), advancing the importance of fractal scaling in the human organism.

Fractal patterns in human physiology
The pervasiveness of fractal scaling in human behavior and physiology is widely acknowledged (Stephen, Arzamarski, & Michaels, 2010), and such patterning is now recognized as an important characteristic of a "healthy" biological system. Our emerging understanding of fractal scaling now suggests that health is often incorrectly attributed to "order" within the physiological system (Varela, Ruiz-Esteban, & Mestre de Juan, 2010). On the contrary, living systems are rarely in states of equilibrium, and "most healthy behaviors are highly irregular and pseudo-random" (Varela et al., 2010, p. 585). Thus, while pathology was once understood as chaos and disorder within a physiological system, in many ways it is now viewed as a decrease in the patterns of self-similarity in aging or diseased systems (Delignières & Marmelat, 2012; Rhea & Kuznetsov, 2017).
Not surprisingly, physiological systems derive benefit from their fractal characteristics, particularly when considering the adaptability and/or flexibility afforded by scale-invariant self-similarity. From an evolutionary perspective, the redundancy of fractal systems at every timescale permits organisms to overcome and adapt to both the stress and unpredictability of environments (McGrath, 2016), on timescales ranging from milliseconds in cellular functions to the entire lifespan, and perhaps even greater timescales such as that of phylogeny (Potts, 1996). Outwardly, self-similarity equates to fluctuations in performance resulting in manifold behavioral assemblies allowing biological systems to be "optimally sensitive" to environmental change (Stephen et al., 2010). The flexibility and adaptability of human behavior to the perennial dynamism of our environment has been heralded as perhaps the defining characteristic of the genus Homo (Potts, 1996).
Yet the fitness of our phenotypes is hardly the sole beneficiary of fractal scaling as, inwardly, the human cognitive function equally derives benefit from self-similarity. As fractals are constructed via iterative processes in which simple rules engender self-similar structures at various scales, the cognitive load required to generate and process multiple hierarchical levels of self-similarity is reduced (Fischmeister, Martins, Beisteiner, & Fitch, 2017). Representation of multiple hierarchical levels by a single rule may reduce cognitive demands (Dias Martins, Laaha, Freiberger, Choi, & Fitch, 2014) while equally aiding in parsing complex, recursive structures (Fischmeister et al., 2017), hierarchical units such as phonemes, syllables, and sentences (Ding, Melloni, Zhang, Tian, & Poeppel, 2016), and even lexical processing (Plat, Lowie, & de Bot, 2018). Taken further, others have suggested that self-similarity and recursion are more than simply "hallmark traits" of cognitive processing; instead, these may fully equate to the default mode of the human cognitive function (Fischmeister et al., 2017). In human physiology and behavior, fractal scaling appears to be the norm and not the exception.

Fractal scaling in the study of language
The potential of fractal analysis has not gone unnoticed in the study of language. Clearly, the iterative patterning of fractal forms evokes a distinct parallel to the recursive nature of human language (Chomsky, 1957). Recursion, by definition, is iterative as it "takes a finite set of elements and yields a potentially infinite array of discrete expressions" (Hauser, Chomsky, & Fitch, 2002, p. 1571). Iteration in language is evidenced, for example, in the generation of syntactic trees as the hierarchical branching of constituents, when isolated, frequently resembles the tree diagram in its entirety. Generally speaking, however, these discussions have stemmed from synchronic representations of recursion at the level of the utterance, that is, at a single temporal scale. This representation, while productive, fails to account for the diachronic iteration of language over temporal scales beyond the utterance and, from a complexity standpoint, for the ways in which such recursion may lead to language development over time. By contrast, scholars attempting to situate the recursive, nested patterns of language use across multiple timescales have drawn on the fractal metaphor to describe phenomena such as the acquisition of French tense and aspect (Lyle, 2009) and the "ebb and flow of information" in discourse (Youmans, 1991; Youmans & Pfeifer, 2005). In this way, Lyle (2009) argues for a fractal model of L2 development in which learners soft assemble (Thelen & Smith, 1994) their language resources in response to the constraints of the linguistic environment. This occurs across multiple levels of analysis as a "narrower focus reveals equally complex but perhaps qualitatively different patterns when compared to a wider angle" (Lyle, 2009, p. 61), yet the extent to which these patterns occur in the acquisition of tense-aspect morphology at multiple temporal scales is still unclear.
Others have argued that power-law distributions of lexis (Larsen-Freeman & Cameron, 2008) and lexical variation (Kretzschmar, 2015) provide the most compelling evidence for language as a complex system. Linguist George Kingsley Zipf (1935) first noted that, given a text of sufficient size, the frequency of any word is inversely proportional to its rank in the frequency table. Thus, language use is characterized by a small number of highly recurrent words and a long tail of words that occur only rarely. This phenomenon occurs not just in the distribution of words within a text (i.e., Zipf's law), but equally in lexical variation as the most common semantic and phonological variants of a particular word occur with exponentially greater frequency than the least common variants (Kretzschmar, 2015). The longstanding recognition of the nonlinear distribution of the lexicon has motivated Larsen-Freeman and Cameron (2008) to argue that "conceivably there are fractals in other language-using patterns" (p. 111) ostensibly extending beyond just the lexicon. Indeed, empirical evidence of power-law distributions in other language patterns would further support the fractal nature of language.
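Zipf's relation is straightforward to verify computationally. The toy corpus below is constructed, purely for illustration, to be exactly Zipfian, so that rank times frequency is constant; real corpora only approximate this:

```python
from collections import Counter

def rank_frequency(tokens):
    """Word frequencies sorted from most to least common; under
    Zipf's law, frequency is inversely proportional to rank."""
    return sorted(Counter(tokens).values(), reverse=True)

# A toy corpus built to be exactly Zipfian with constant C = 12:
# word_1 appears 12 times, word_2 six, word_3 four, word_4 three.
tokens = [f"word_{r}" for r in range(1, 5) for _ in range(12 // r)]

freqs = rank_frequency(tokens)                            # [12, 6, 4, 3]
products = [r * f for r, f in enumerate(freqs, start=1)]  # every product is 12
```

Plotting such rank-frequency pairs on log-log axes yields the straight line characteristic of a power law.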
Perhaps the most credible empirical evidence of the self-similar dynamics of discourse is offered by Youmans (1991). In an attempt to discern the distribution of given versus new information within a text, Youmans followed a simple algorithm to assign values of 1.0 to words appearing for the first time and 0.0 to repetitions, that is, new versus given information respectively. Next, moving averages of the values were plotted, generating vocabulary-management profiles (VMPs). As new vocabulary correlates with new information and repeated vocabulary with given information, the increasing and decreasing VMP trendlines illustrate the ebb and flow of information in discourse. Upon closer inspection, the VMPs revealed fractal patterning as givenness cycles throughout discourse with smaller peaks and valleys of given versus new information at the sentential level nested within larger peaks and valleys across paragraphs, sections, episodes, and so on. The nested dynamics of givenness revealed by the VMPs offer distinct evidence of a linguistic system in which self-similar patterns of variability retain their complexity regardless of the scale of measurement.
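Youmans's algorithm is simple enough to restate in code: score 1.0 for a word's first appearance, 0.0 for a repetition, then smooth with a moving average. The window size of 3 below is an illustrative choice, far smaller than the windows Youmans applied to full texts:

```python
def vmp(tokens, window=3):
    """Vocabulary-management profile in the spirit of Youmans (1991):
    new words score 1.0, repetitions 0.0, smoothed with a trailing
    moving average whose peaks and valleys track the ebb and flow
    of new versus given information."""
    seen, scores = set(), []
    for tok in tokens:
        scores.append(0.0 if tok in seen else 1.0)
        seen.add(tok)
    profile = []
    for i in range(len(scores)):
        win = scores[max(0, i - window + 1):i + 1]
        profile.append(sum(win) / len(win))
    return profile

# An all-new opening scores 1.0; repetition drags the profile down.
profile = vmp("the cat saw the dog the dog ran".split())
```

Applied to a longer text, peaks in the profile mark bursts of new information and valleys mark stretches of given information, at whatever scale the window resolves.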
In sum, fractal analysis allows us to observe and describe the patterns of complex dynamic systems as they iterate over time across multiple timescales (McGrath, 2016). As the true timescales of linguistic development still elude us (de Bot, 2015; Ruhland & van Geert, 1998), viewing emerging patterns in this way holds great potential for research into the human language capacity. This potential, coupled with the relative paucity of empirical studies specifically examining the fractal nature of language, speaks to the urgency of work in this area. As such, the present study responds directly to these issues and was informed by the following research question:

· In what way(s) do the developmental trajectories of complex syntax exhibit the characteristics of self-similarity and scale invariance typical of fractal scaling over time?

The study
Upholding the preference for process-oriented data collection and analysis in CDST, this study employed a nonlinear time series analytic framework (Myers, 2016). Time series analyses rely on consecutive, equally spaced observations of a given phenomenon for a specified period of time. Once a longitudinal dataset is compiled, time series designs then allow for the full dataset to be represented graphically in a time series plot (Hiver & Al-Hoorie, 2020), or scrutinized via CDST-informed analyses such as min-max graphs, Monte Carlo simulations, and moving correlations (Verspoor, de Bot, & Lowie, 2011).

Participant
The participant, Alceste,1 was an adult male learner of English as a second language who worked as an instructor of French for one academic year at a large, public university in the northeastern United States. Coming from the francophone region of Switzerland, Alceste reported only two years of formal education in English at the secondary level and suggested that his linguistic ability in English had developed through interaction with English-speaking friends in Switzerland. Alceste was 27 years old upon commencement of data collection, and his oral proficiency in English was informally assessed at the intermediate low level (ACTFL, 2012), an assessment affirmed by a colleague with extensive knowledge of L2 assessment. During his time in the United States, Alceste frequently sought opportunities to interact with speakers of English and communicated a desire to improve his linguistic abilities, though he did not enroll in any formal instruction in ESL. Consistent with other literature in SLA (e.g., Polat & Kim, 2014), Alceste is considered an untutored learner of English given his lack of exposure to the instructed setting. Although he was cognizant that my objective was to study the development in his language proficiency, he remained unaware of any specific focal areas of study.

Data collection
Dialogic conversations between Alceste and me were audio recorded and transcribed at weekly intervals over the course of one academic year for a total of 30 conversations. To uphold the most naturalistic setting possible, conversations were not scripted in any way, although I frequently began by asking Alceste about his week and inquiring as to the opportunities that he had had to engage in interaction in English. Recurrent conversation themes included his position as an instructor of French, his interest in French literature, cultural differences between Switzerland and the United States, and the like. Weekly conversations lasted a minimum of 20 minutes, although longer conversations were never discouraged. To maintain consistency, I limited the analysis of each conversation to its first 20 minutes.

Unit of analysis and syntactic complexity
Dialogic conversation data were first segmented into analysis of speech units (AS-units; Foster, Tonkyn, & Wigglesworth, 2000), defined minimally as any "independent clause, or sub-clausal unit, together with any subordinate clause(s) associated with either" (p. 365). Unlike other units of analysis employed in text/discourse studies, the AS-unit allows for the inclusion of sub-clausal units in analysis, or those utterances that do not necessarily contain a finite verb (and/or other syntactic elements) yet figure prominently in dialogic oral data (Norris & Ortega, 2009). A methodological note is warranted, however, as few studies employing the AS-unit explicitly discuss the level of application adopted for analysis. In this study, I employed the AS-unit at the second level given the highly interactive nature of the dialogic conversation task. At this level, one-word minor utterances (e.g., yeah) and verbatim echoic responses are excluded from analysis. The operational definitions of syntactic complexity were consistent with much of the extant literature (Bulté & Housen, 2012; Norris & Ortega, 2009). In this way, complex syntax was viewed at the clausal, supra-clausal, and sentential levels. Clausal complexity was calculated as the mean length of clause in words; supra-clausal complexity as the total number of clauses per AS-unit via the mechanism of subordination; and sentential complexity was calculated as the mean length of AS-unit in words.
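Given AS-units already segmented into clauses, the three measures reduce to simple ratios. The nested-list representation below (AS-units as lists of clauses, clauses as lists of words) is my own illustration, not the study's coding scheme:

```python
def syntactic_complexity(as_units):
    """Three complexity measures: mean length of clause in words
    (clausal), clauses per AS-unit (supra-clausal), and mean length
    of AS-unit in words (sentential)."""
    clauses = [clause for unit in as_units for clause in unit]
    total_words = sum(len(clause) for clause in clauses)
    return {
        "clausal": total_words / len(clauses),
        "supra_clausal": len(clauses) / len(as_units),
        "sentential": total_words / len(as_units),
    }

# Two invented AS-units: one with a subordinate clause, one without.
sample = [
    [["I", "think"], ["that", "he", "left"]],  # 2 clauses, 5 words
    [["She", "smiled"]],                       # 1 clause, 2 words
]
measures = syntactic_complexity(sample)
```

For the invented sample this yields 7/3 words per clause, 1.5 clauses per AS-unit, and 3.5 words per AS-unit.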

Data analysis
Notably, there is no "cookbook" approach to fractal analysis (McGrath, 2016), as each distinct field has employed numerous tools to analyze and interpret fractal scaling given the respective data at hand. Thus, I sought to determine the fractal-like nature of language in two distinct ways: via the presence of power-law probability density functions and by examining the self-similar temporal dynamics of the system.
To determine the probability density distribution of complex syntax in conversation, I first calculated the number of clauses per individual AS-unit over the 30 weeks of data collection, which allowed me to determine the total number of AS-units with a density of one clause (e.g., one independent clause), two clauses (e.g., one independent plus one subordinate clause), three clauses (e.g., one independent plus two subordinate clauses), and so forth. Finally, I plotted the results of this analysis graphically with frequency of occurrence on the y axis and clausal density on the x axis to identify the extent to which a power-law probability density function governs the distribution of complex syntax.
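The tabulation step amounts to counting AS-units by clausal density. A minimal sketch (the clause counts below are invented for illustration, not drawn from the study's data):

```python
from collections import Counter

def clausal_density_distribution(clauses_per_unit):
    """Frequency of AS-units containing 1, 2, 3, ... clauses:
    clausal density on the x axis, frequency on the y axis."""
    return dict(sorted(Counter(clauses_per_unit).items()))

# Invented clause counts for ten AS-units; single-clause units
# dominate, as a power-law-like distribution would predict.
density = clausal_density_distribution([1, 1, 1, 1, 2, 2, 3, 1, 2, 1])
```

Plotting the resulting frequencies against density reveals whether the long-tail, power-law curve is present.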
Next, to examine self-similar dynamics, raw data for clausal, supra-clausal, and sentential complexification were standardized and the z-scores were plotted graphically with time in weeks on the x axis. Similar to other complexity studies (e.g., Larsen-Freeman, 2006), plotting the standardized scores in this way allowed the three developmental trajectories to be scrutinized comparatively. To address the potential self-similarity in the dynamic interactions among the subconstructs of complex syntax, the relationships between these measures were plotted and examined at three distinct temporal levels. This included the successive magnification of the conversation data from the level of the entire dataset (30 weeks), to 30 conversation turns within one week, and finally to 28 AS-units within a single conversation turn as 28 was the maximum number of contiguous AS-units within a single turn. The scaling of this analysis was chosen specifically to reflect the possible adaptations of complex syntax over the various legitimate timescales of language use and development -within one conversation turn, across a single conversation, and finally over one academic year.
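The standardization step converts each raw trajectory to z-scores so that the three measures, which live on different raw scales, can be plotted and compared directly. A minimal sketch with invented weekly values:

```python
import statistics

def standardize(series):
    """Convert a raw time series to z-scores (mean 0, SD 1)."""
    mean = statistics.mean(series)
    sd = statistics.stdev(series)
    return [(x - mean) / sd for x in series]

# Invented weekly mean-length-of-clause values.
weekly_mlc = [6.1, 6.4, 5.9, 7.2, 6.8]
z = standardize(weekly_mlc)
```

The same transformation applies unchanged at each magnified timescale, whether the series spans 30 weeks, 30 turns, or 28 AS-units.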
Typical of nonlinear time series analyses, inspection of the phase space plots for each successive temporal level visually indicated self-similar patterns in variability across timescales. To confirm this finding, the percentage of total variance of each of the three constructs of complex syntax (clausal, supra-clausal, and sentential complexification) was calculated in Excel at each of the three distinct timescales. If complex syntax is indeed fractal-like, the percentage of total variability that each construct accounts for should remain constant across temporal scales.
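The variance-share computation itself is simple; a minimal sketch (with hypothetical trajectory values) looks like this:

```python
import statistics

def variance_shares(trajectories):
    """Return the percentage of total variance contributed by each construct
    at a single timescale. If syntax is fractal-like, these percentages
    should stay roughly constant when the same computation is repeated at
    the 30-week, 30-turn, and 28-AS-unit scales."""
    variances = {name: statistics.pvariance(vals)
                 for name, vals in trajectories.items()}
    total = sum(variances.values())
    return {name: 100 * v / total for name, v in variances.items()}

# Hypothetical z-scored trajectories at one timescale
shares = variance_shares({
    "clausal": [0.2, -0.5, 1.1, -0.8, 0.0],
    "supra-clausal": [1.3, -1.0, 0.4, -0.9, 0.2],
    "sentential": [0.6, -0.7, 0.9, -1.2, 0.4],
})
# The three percentages sum to 100 (up to floating-point error).
```

Repeating this at each temporal level and comparing the resulting percentages is the scale-invariance test reported in Table 1.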

Power-law probability density function
In total, Alceste produced just under 5,900 clauses during our weekly conversations over the academic year. Frequency/density calculations of clausal production were plotted graphically to illustrate the probability density function of their distribution. The results of this analysis across 30 weeks of conversation data are shown in Figure 3. The long-tail, asymptotic curve of a power-law distribution is clearly evident in the visualization. This suggests a rapidly attenuating distribution of clausal density in which, over a prolonged period, a speaker in dialogic interaction prefers single-clause AS-units at a rate exponentially greater than two-clause, three-clause, or longer AS-units. While longer AS-units were indeed produced throughout the academic year (a 7-clause AS-unit occurred just twice in the data), such units were rare.

Figure 3
The frequency/density distribution of clausal density over 30 weeks

The distribution in Figure 3 points to the constraints and affordances of the external linguistic environment and the way in which these interact with the internal language faculty. Clearly permissible are long, dense utterances capable of transmitting vast amounts of information, yet the preference for truncated utterances may be indicative of a desire to minimize processing effort (Filipović & Hawkins, 2013). When necessary, however, the power-law governing the production of complex syntax engenders the requisite flexibility to respond to the demands of language in use, thus empowering the speaker to convey even the most highly intricate information.

Self-similarity across temporal scales
The dynamic self-similarity among the constructs of clausal, supra-clausal, and sentential elaboration across three temporal scales can be seen in Figures 4, 5, and 6.

Figure 4
The dynamics of syntactic complexity over 30 weeks (z-scores)

Figure 4 illustrates the nonlinear nature of Alceste's complex syntax over 30 weeks. Although no construct demonstrated a marked increase over one academic year, several observations can be made. To begin, overall sentential elaboration appears highly variable over the period of data collection. At times, a degree of relative consistency can be observed (e.g., Weeks 16 through 21), while at others, there are much larger differences between the peaks and valleys of the trajectory (e.g., Weeks 11 to 13). Next, the dynamic relationship between subordination and clausal elaboration suggests that sentence length is, at times, accounted for by clausal elaboration and subordination rather equally (e.g., Weeks 1 through 3), while in other weeks there is significant disparity between the mechanisms that best account for sentence length (e.g., Week 4).

Figure 5
The dynamics of syntactic complexity over 30 conversation turns (z-scores)

Figure 6
The dynamics of syntactic complexity over 28 AS-units (z-scores)

Next, to illustrate the dynamic self-similarity and scale invariance of complex syntax, Figures 5 and 6 show the trajectories of these constructs at further reduced temporal scales: that of 30 conversation turns within one week and that of 28 contiguous AS-units within a single conversation turn. When viewed together, the variability within the trajectories of subordination, clausal elaboration, and sentential elaboration in the three graphs is strikingly similar at all three timescales. Further supporting the fractal nature of complex syntax, Table 1 illustrates the self-similarity of the variability among the three constructs. In the same way that a smaller and smaller ruler will return increasingly larger measurements of a coastline, total variance predictably differs across the three distinct periods of measurement, yet the percentage of the total variance that each construct accounts for remains the same independent of temporal scale. In this way, complex syntax demonstrates the fractal characteristic of self-similarity: each construct accounts for the same share of total variance regardless of the scale of measurement. Across temporal scales, a complex interplay exists between the subconstructs of syntactic complexity. As utterance length varies, the mechanisms by which this process occurs vacillate between a reliance on subordination and clausal elaboration. At times, these two constructs function in unison to complexify Alceste's speech; at others, they appear to exist in competition. Perhaps most important, however, is the self-similar, scale-free nature of these trajectories as they maintain their nonlinearity at every temporal scale.

Discussion
From a complexity theory perspective, examining the fractal-like nature of linguistic phenomena represents a novel approach to the study of language, a point echoed by Larsen-Freeman (2012) who writes that "complexity theory inspires us to think differently about language and language development" (p. 202). Naturally, different ways of conceptualizing a construct require different ways of approaching that construct both theoretically and methodologically. The fractal analysis detailed herein allows us to do just that. From the data presented above, we see that clausal density in discourse is governed by a power-law distribution typical of fractals found in the physical world. In addition, the dynamic patterns of variability of complex syntax appear self-similar across three distinct temporal scales. Taken together, the results of this analysis provide support for the fractal nature of language, which is a key assumption in a complexity-informed perspective (Larsen-Freeman, 1997).
What such fractal scaling suggests is that complex syntax, like many human behaviors, demonstrates scale-invariant temporal flexibility (Delignières & Marmelat, 2012), allowing language to adapt to the needs of the speaker across timescales (see also Lyle, 2009; Shanon, 1993). Within a single conversation turn, the fluctuating variability of sentence length, clause length, and density of subordination permits the speaker to expressly communicate their developing thoughts on a moment-to-moment basis. Across an entire conversation, the same flexibility allows for changes in topic and context, dynamic discourse pragmatics including givenness (Youmans & Pfeifer, 2005), and stylistic choices such as repertoire. The macrogenetic flexibility exhibited at greater timescales (i.e., one academic year or beyond) accommodates the slower dynamics associated with changes in sociolinguistic environment, brain maturation in children, levels of education, interpersonal relationships, and so forth. In sum, it seems that the human language faculty capitalizes on the adaptability and flexibility of fractal scaling in much the same way as many, if not all, living organisms.
As a complexity-informed view of language suggests, learners do not progress from one conceptual level of linguistic ability to another in linear fashion (Larsen-Freeman, 2006). Instead, distinct properties of linguistic proficiency develop recursively according to their own subjective timescales. As most will recognize, many systems may demonstrate change on larger timescales (e.g., pragmatic competence), while others (e.g., pronunciation) may demonstrate much more acute, perhaps discontinuous shifts (de Bot, 2015). Acknowledging the disparate timescales of development is without question a salient contribution of the complexity paradigm, yet developing a more precise understanding of the true timescales of development presents a significant conceptual hurdle (Ruhland & van Geert, 1998). In this way, applied linguistics is currently faced with a timescale problem. As traditional analysis has relied on measurements of language at single temporal scales, viable methods of tracing the development of linguistic constructs across distinct scales of time have remained elusive. The timescale problem, then, relates to the limitation of these methods and the imperfect understanding of the subjective timescales of development that results.
The behavior of complex systems emerges as the result of the synergistic relationship between the components germane to the system. This notion takes the timescale problem a step further. As van Geert (1994) prudently outlines, not only do the components of complex systems develop in a stochastic manner, but the synergistic relationships between these components express equal dynamism over time. Importantly, as the system progresses through periods of self-organization, the dynamic relationship between some system components may both support and/or hinder the development of others. Further still, some relationships may even be precursory in nature given that the emergence of one behavior is predicated on the existence of another. This conjecture raises the question of how constructs developing on shorter temporal scales may ultimately influence the development of other constructs over longer periods of time. I argue that fractal analysis, in which trajectories of development are plotted and scrutinized at hierarchical temporal scales, may serve as an effective tool capable of providing insight into the timescale problem.
Examining rates of change (de Bot, 2015) in regard to the myriad timescales of development will prove indispensable in advancing our understanding of the complex linguistic system. The importance of timescales reifies the methodological significance of dense data collection over extended periods of time, as consistency in measurement permits the magnification of developmental trajectories to smaller and smaller windows of time.

Conclusion
The fractal analysis presented here, though rudimentary, clearly supports the transdisciplinary nature of CDST (Larsen-Freeman, 2012). As this theoretical orientation has hinted at the fractal nature of language (Larsen-Freeman & Cameron, 2008), the self-similarity and scale invariance of syntactic complexity illustrated herein lends support to this notion. The variability of complex syntax as it dances among the timescales is a reminder of the power of the human language faculty and its ability to adapt to the perpetual dynamism of the linguistic environment over time. In sum, fractal analysis aligns with the goals of complexity research in that we seek to "represent and understand specific complex systems at various scales of description" (Hiver & Al-Hoorie, 2016, p. 752). Yet the timescale problem is reflective of the fact that we are still unaware of what scales of description may be most suitable in the study of language development. Examining the fractal self-similarity and scale invariance of language use and development across temporally hierarchical scales may yield significant insight into this phenomenon.
As discussed above, Alceste's use of complex grammar did not experience growth in the customary sense of a marked increase over time. This may be due to the advanced nature of his language before arrival or, equally, to his status as an untutored learner receiving little to no explicit language instruction during his residency (Polat & Kim, 2014). Thus, from this particular dataset, we are unable to glean how these trajectories may have changed, on various timescales, had he arrived with lower levels of proficiency. As one reviewer rightfully pointed out, the lack of growth in the participant's use of complex syntax precludes any claims as to the emergence of fractal dynamics in L2 development. Instead, emphasizing the exploratory nature of this study, the hope is that the results indicate the promising nature of a fractal analytic framework while equally motivating L2 researchers to further address the timescale problem.
Had a discontinuous jump, or phase shift (Baba & Nitta, 2014), occurred in Alceste's data at a certain timescale, say at the level of weeks, this particular transition may have indicated a point at which to zoom in via fractal analysis. When phase shifts are identified in time-series data, researchers may focus on smaller and smaller temporal scales to potentially identify the mechanics motivating such change. Equally, this process may indicate particular moments that merit further scrutiny from a qualitative perspective. If discontinuity is detected, for example, within a single conversation turn or across multiple turns within a conversation, more detailed qualitative analysis of those moments may be productive as a way to interpret the relationship between micro and macrogenetic development. Doing so would allow scholars to make claims regarding the situatedness of micro-levels of development as they relate to development across much broader temporal scales. Likewise, as data were collected on a weekly basis, any attempt to view self-similarity could only be completed at one point within one week. Though the opportunity never arose, it would have been prudent, at least once, to collect measurements with even greater levels of density within a single week. This could have been as simple as meeting for perhaps three consecutive days to permit a much more in-depth analysis of self-similarity.
In other disciplines such as physiology, psychology, and medicine, fractal analyses have relied on dense quantitative measurements of behavior and physiological processes facilitated greatly by advances in technology. These have been extended by complex analytic methods such as detrended fluctuation analysis, the box-counting dimension, 1/f scaling (Van Orden, Holden, & Turvey, 2003), and so on. As the data and the interpretations presented in this article are exploratory in nature, applying these tools suggests a logical next step. In applied linguistics, much complexity-informed scholarship has equally relied on dense numerical data analyzed via methods such as Monte Carlo simulations, moving min-max graphs, moving window correlations, and so on. Yet few studies, if any, have generated data points in the tens of thousands necessary for the analytic approaches found in other fields.
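For readers unfamiliar with these tools, detrended fluctuation analysis (DFA) in particular is straightforward to prototype. The sketch below is a minimal, illustrative implementation only (dedicated, validated packages should be preferred for real analyses). A DFA exponent near 0.5 indicates uncorrelated noise, while values nearer 1.0 indicate the long-range, 1/f-type correlations associated with fractal temporal structure:

```python
import numpy as np

def dfa_exponent(series, window_sizes):
    """Minimal detrended fluctuation analysis (illustrative sketch).

    Integrates the mean-centred series, removes a local linear trend in
    windows of several sizes, and returns the slope of log F(n) vs. log n,
    i.e., the DFA scaling exponent.
    """
    x = np.asarray(series, dtype=float)
    profile = np.cumsum(x - x.mean())          # integrated signal
    fluctuations = []
    for n in window_sizes:
        n_segments = len(profile) // n
        segments = profile[:n_segments * n].reshape(n_segments, n)
        t = np.arange(n)
        rms = []
        for seg in segments:                   # detrend each window
            coeffs = np.polyfit(t, seg, 1)
            residuals = seg - np.polyval(coeffs, t)
            rms.append(np.sqrt(np.mean(residuals ** 2)))
        fluctuations.append(np.mean(rms))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(fluctuations), 1)
    return slope

# Sanity check on synthetic data: white noise should yield an exponent near 0.5
rng = np.random.default_rng(seed=1)
alpha = dfa_exponent(rng.standard_normal(4096), [8, 16, 32, 64, 128, 256])
```

Note that a series of 4,096 points is used here precisely because DFA and its relatives demand the dense sampling that, as noted above, few applied linguistics datasets currently provide.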
Lastly, as this investigation was exploratory in nature, the fractal features of complex syntax were not compared in terms of their relation to development in other areas of the language faculty. Emphasis on the interconnectedness of complex systems points directly to the potential significance of viewing self-similarity in this way. By looking at the dynamic interactions between components across multiple timescales, the changing relationships between constructs may be viewed as both a macro- and a microgenetic phenomenon.