
    Clubturgot.com, 100th edition

Chronicles

Jean-Jacques Pluchart, Editor-in-Chief

Clubturgot.com celebrates its 100th issue. Since March 2024, the leading French- and English-language newsletter on economic and financial literature has published over 300 articles, book reviews and tributes to the works of leading economists. Over the weeks, in order to better meet the expectations of their readers, the newsletter's editors have published their articles first in French and then in English, focusing on the work of theorists and the insights of practitioners in the increasingly numerous and complex fields of economics and finance. Every week, in just a few minutes, the 30,000 readers of clubturgot.com can thus learn about the key economic and financial events of the day.

The authors of the articles published on clubturgot.com are Turgot Prize winners, representatives of partner associations and members of the Club Turgot, which pre-selects the books submitted to a jury of distinguished figures chaired by Jean-Claude Trichet. Since the Turgot Prize was established in 1987, the Club has read around 5,000 books and reviewed nearly 4,000, and the jury has awarded, first in the halls of the Senate and then at Bercy, 39 Grand Prizes, 41 Jury Prizes, 102 honourable mentions and 120 Special Prizes (for collective works, educational books, French-language books, young authors, the DFCG and AF2i). Together with the Prize winners brought together in the Cercle Turgot, the Club Turgot has also published 22 collective works on key economic and managerial issues. The reviews of the award-winning books have been compiled into three volumes: La pensée économique française (French Economic Thought, 2 volumes) and Les leçons de Turgot et de Smith (The Lessons of Turgot and Smith).

For its 100th issue, clubturgot.com presents:
– an original review by Jean-Jacques Pluchart on the impact of Artificial Intelligence on the banking profession;
– a presentation by Philippe Alezard on the work of Norbert Wiener, a leading mathematician in the field of finance;
– an analysis by Sophie Ffriot of Eric Weil's book Retraites, un blocage français (Pensions, a French Impasse).

March 18, 2026

    Wiener’s Process: From Pollen to Financial Markets

Chronicles

Philippe Alezard

According to classical economic theory, the price of an asset is determined by the interaction between supply and demand: between a seller wishing to dispose of the asset and a buyer wishing to acquire it. The stronger the demand, for whatever reasons, the higher the price of the asset will be pushed. Indeed, in theory, demand can be almost unlimited, whereas the asset, by definition, exists in limited quantity. Conversely, when demand is low, the seller seeking to dispose of their asset will tend to lower the price in order to find a buyer. We can therefore understand that, at any given moment, the price corresponds to the point of equilibrium between supply and demand.

All of this is true in an ideal, theoretical world. In reality, a wide range of events can occur at any time: geopolitical, climatic, informational, economic or financial. These events trigger emotional reactions – panic, rumours, euphoria – which in turn lead to human decisions, as well as algorithmic positioning in one direction or another. This multitude of random shocks affects the behaviour of economic agents and creates erratic and unpredictable movements in the short term. It is precisely these random fluctuations, this constant uncertainty, that mathematicians have sought to model in the form of stochastic processes.

The first person to have this insight was Louis Bachelier. In his now famous thesis[1] 'Théorie de la spéculation' ('Theory of Speculation'), submitted in 1900, he introduced the use of probabilities to describe price movements and showed that price changes can be represented as a sequence of independent, identically distributed random variables. He went even further by constructing histograms showing that these variations were distributed according to a bell-shaped curve, in other words a Gaussian distribution. In this way, Bachelier laid the initial foundations of a finance based on Brownian motion, a diffusion process, and the normal distribution.

However, the history of Brownian motion begins long before finance. In 1827, Robert Brown[2] used a microscope to observe the persistent agitation of pollen particles suspended in a fluid. This phenomenon is explained by the incessant and random impacts of the fluid's molecules on the particles. The individual movement of each molecule is negligible, but the combined effect of all these impacts produces a completely erratic overall movement. The Brownian motion of a particle can therefore be modelled as a stochastic process characterised by a succession of independent increments, with a mean of zero, whose magnitude and direction vary unpredictably.

In finance, the pollen particle becomes the price of an asset suspended in a market. The molecules of the fluid are replaced by the multitude of buy and sell orders, themselves driven by an endless stream of information, events and sometimes conflicting decisions.

For a process to be classified as standard Brownian motion, three properties must be satisfied:
1. All trajectories start at the origin; more precisely, in the probabilistic sense, the probability that the trajectory starts at zero is equal to one.
2. Each increment of the process is independent of the previous ones: future changes do not depend on the past. Brownian motion has no memory.
3. The distribution of the increments P(t+1) − P(t) at each instant follows a normal distribution with mean zero, whose variance, (t+1) − t, is proportional to the elapsed time.

It was precisely these properties that Bachelier had already observed when studying prices on the Paris Stock Exchange. However, it was Norbert Wiener who would give this phenomenon its rigorous mathematical formalisation.

Born in 1894 in Columbia, Missouri, Wiener came from a Russian Jewish family that had immigrated to the United States. His father, Leo Wiener, a translator of Leo Tolstoy's complete works and later a professor of Slavic languages at Harvard, personally oversaw his son's education, employing innovative teaching methods. A child prodigy, Norbert received his primary education at home, entered secondary school in 1903 and obtained the equivalent of the baccalaureate in 1906, at the age of twelve. He then attended Tufts University before moving on to Harvard, where he defended a thesis on mathematical logic. At the age of just eighteen, he became the youngest doctoral graduate in the history of that prestigious university. After his thesis defence, he travelled to Europe: in Cambridge, he attended Bertrand Russell's lectures; in Göttingen, he studied with David Hilbert, one of the greatest mathematicians of the 20th century. Back in the United States, after several temporary positions, Wiener joined MIT in 1919, where he would spend most of his career. There, he developed a remarkably diverse body of scientific work, spanning mathematical analysis, probability, engineering and the philosophy of science.

Brownian motion had been known since Brown's observation in 1827. In 1900, Bachelier had applied it to price fluctuations. In 1905, Albert Einstein published his theory of Brownian motion, while Marian Smoluchowski[3] independently developed an approach based on the random collisions of molecules. These works provided a physical interpretation of Brownian motion. However, a fundamental question remained: how could a continuous random trajectory over time be rigorously defined in mathematics?

By 1923, discrete random walks, such as those resulting from a game of heads or tails, were already well understood; they involve only a finite number of random variables. The transition to continuous time, however, posed a major conceptual challenge: how could a probability be defined over an uncountable infinity of random variables? In his article 'Differential-Space'[4], Wiener proposed an elegant solution. He considered the set of possible trajectories as the points of a functional space of infinite dimension. To construct a probability measure on this space, he began by discretising time, subdividing the interval [0, 1] into n segments:

0 = t0 < t1 < t2 < … < tn = 1

He then studied only the increments:

X1 = J(t1) − J(0)
X2 = J(t2) − J(t1)
…
Xn = J(tn) − J(tn−1)

Three points are crucial:
1. It is not the successive positions that are independent, but the displacements over each interval;
2. Each increment follows a normal distribution, the variance of which is proportional to the length of the interval;
3. …
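Wiener's increment construction maps almost line for line onto code. Here is a minimal sketch in Python (not from the article; an illustration of the idea): it draws independent Gaussian increments whose variance equals the length of each subinterval and sums them from zero, so the simulated trajectory satisfies the three properties of standard Brownian motion listed above.

```python
# Minimal sketch (illustrative, not from the article): a discretised
# standard Wiener process on [0, 1], built from the three properties above:
# W(0) = 0, independent increments, and W(t_k) - W(t_{k-1}) ~ N(0, t_k - t_{k-1}).
import numpy as np

def simulate_wiener(n_steps: int = 1000, horizon: float = 1.0, seed: int = 0) -> np.ndarray:
    """Return one trajectory W(t_0), ..., W(t_n) sampled on a regular grid."""
    rng = np.random.default_rng(seed)
    dt = horizon / n_steps
    # Independent increments X_k = W(t_k) - W(t_{k-1}) ~ N(0, dt):
    increments = rng.normal(loc=0.0, scale=np.sqrt(dt), size=n_steps)
    # The trajectory starts at zero; cumulative sums of the increments give W(t_k).
    return np.concatenate(([0.0], np.cumsum(increments)))

path = simulate_wiener()
print(path[:5])  # first few points of one erratic, memoryless trajectory
```

Interpreting the simulated process as the deviation of an asset price from its initial level recovers the random-walk picture of market prices that Bachelier described.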

March 18, 2026

    The Impact of Artificial Intelligence on Banking Professions

Chronicles

Jean-Jacques Pluchart

The spectacular advances in AI – and in particular in generative AI since 2022 – are disrupting the strategies, organisational structures and practices of a growing number of industries, particularly in the banking and insurance sectors. The scale and speed of these transformations can be seen in the fluctuations in the margins, earnings and share prices of listed institutions. The most erratic fluctuations in certain stock prices reflect the uncertainty felt by savers and investors regarding the ability of banks and insurers to adapt their value creation chains and rebuild their business models.

Banking businesses are built on the secure management of personal data and the coverage of risks of various kinds. Traditionally, they are divided into retail banking and wholesale banking. However, they are becoming increasingly differentiated according to the bank's predominant strategy, which may focus on volume or on the differentiation of its products and services. In the former case, they cover 'document-intensive' functions; in the latter, 'high-responsibility' functions. The former encompass the administration of day-to-day operations, the generation of contracts, customer relationship management (CRM), accounting and financial analyses, etc. The latter involve trade-offs between transactions, the issuance of credit, legal, tax and financial arrangements, and strategic decisions. The former can increasingly be replaced by automated processes; the latter can only be supported by dedicated AI-based models for recognition, classification, simulation, projection, correlation, etc.

Distinguishing between these two types of activities is becoming increasingly difficult due to the rapid progress of AI and LLMs, which are based on the massification of data, the acceleration of data processing, the proliferation of specialised AI agents and, above all, the ability to quickly code new programs using natural language (machine learning or automatic encoding). As a result, new functions are being 'augmented' by AI: the development of more sophisticated chatbots for interacting with prospects and customers, the securing of data and data processing, the systematisation of securities rating, the automation of compliance (due diligence), the optimisation of securities settlement and delivery, as well as more reliable forecasting models (predictive trading) and the simulation of credit and market risks. Functions that were previously performed by specialists with rare skills are thus becoming 'commodities' provided by standard applications (benchmarks).

Advances in AI and LLMs are leading to the disintermediation of value creation chains, the reconfiguration of banks' business models, and the reshaping of their ecosystem. These shifts can be observed in the changes in the margins and stock market prices of banking institutions and their subcontractors. Outsourced SaaS software licences are gradually being replaced by proprietary models generated through 'vibe coding' at low marginal cost. This transition to token-based pricing has already led to a drop in the MSCI USA Software index; the share prices of Salesforce, Thomson Reuters and LegalZoom, for example, have been affected. The downward trend in margins and valuation multiples is beginning to affect software publishers, property and personal insurance companies, and financial and non-financial rating agencies. Insurify's launch of a purchasing agent capable of instantly comparing millions of policies caused a sharp drop in the value of Willis Towers Watson and Aon. In the fields of accounting (auditors, analysts) and credit rating, the same phenomenon has affected certain agencies, such as S&P Global, Moody's and FactSet.

Retail banks, which focus on providing advice and credit to individuals and SMEs, are directly exposed to a loss of competitive advantage unless they demonstrate their ability to adapt quickly to the changes brought about by AI. In contrast, investment and merchant banks benefit from barriers to entry based on the personalisation of client relationships (i.e., on trust and personalised historical data), on financial, legal and tax structuring (M&A, major projects, etc.), particularly at the international level, on wealth management, on the securitisation of receivables, on the management of derivatives, on strategic decisions, and even on certain functions related to shadow banking (management of investment funds, tax avoidance schemes, etc.). The quality of the banking relationship creates value when it is developed during a monetary and financial crisis, or simply in a volatile market environment. The involvement of a human adviser provides the client with 'mental and emotional well-being' and greater confidence in the future.

This transformation of banking models prompts us to revisit the lessons taught by Michael Porter since the 1980s, which distinguish between corporate strategies based on volume and those based on service differentiation. In the wake of advances in AI, these lessons are once again becoming increasingly relevant. Banks are being compelled to adopt strategies that focus on innovative and phygital activities, combining these two approaches.

March 18, 2026

    Basel IV: a new name for a strengthened Basel III

Chronicles

Over the past few years, one expression has become common across banks: Basel IV. It is heard in ALM committees, risk departments and discussions between finance teams and business lines. Yet officially, this term does not exist. Regulators continue to refer to the "finalisation of Basel III". This distinction is not merely semantic — it reflects the very philosophy of the reform.

In reality, Basel IV is not a new regulatory framework. It is a strengthened, refined and more harmonised version of Basel III. The objective remains unchanged: to reinforce the resilience of the banking system after the 2008 financial crisis. However, years of implementation have revealed a key issue — comparable banks could report significantly different capital levels depending on their internal models. The recent adjustments aim primarily to reduce these discrepancies.

At the core of the reform lies the output floor. Its principle is straightforward: risk-weighted assets calculated using internal models cannot fall below 72.5% of those calculated under standardised approaches. Sophistication is still allowed, but it can no longer endlessly reduce capital consumption. This represents a major shift in logic. For years, the ability to develop advanced internal models was a competitive advantage. That advantage is now framed within clear boundaries. Models are not disappearing, but they are being brought back into a common corridor.

This is one of the main reasons why banks have adopted the term Basel IV: the philosophy has changed. The industry is moving from a system where optimisation played a central role to a more harmonised and comparable framework.

But this is not the only evolution. Credit risk has been significantly revised. The scope of internal models has been restricted, and key parameters are now subject to floors. Standardised approaches have become more risk-sensitive, particularly for real estate exposures, specialised lending and off-balance-sheet commitments. The objective is clear: to prevent structural underestimation of risk. Operational risk is also evolving. Former methodologies — often complex and highly dependent on internal modelling — are being replaced by a simpler, standardised approach. Once again, the guiding principles are comparability and readability. Counterparty risk and market risk are also being reshaped. The Fundamental Review of the Trading Book (FRTB), postponed to 2027 in Europe, redefines the boundary between the banking book and the trading book while strengthening sensitivity to market conditions.

Taken individually, these adjustments may appear technical. Taken together, they produce a structural effect: a reduction in banks' room for interpretation when calculating capital requirements.

Yet the most silent transformation may lie elsewhere — in data. The reform goes beyond capital ratios. It deeply reshapes prudential reporting and external disclosure. Data reported to supervisors and data published under Pillar 3 are now converging. Every prudential figure becomes potentially public. This increase in transparency raises the bar significantly in terms of data quality, consistency and traceability.

Banks must now produce two views of capital: one based on internal models and another incorporating the prudential floor. This dual perspective changes internal steering. An activity that appeared efficient under historical modelling assumptions may become more capital-consuming once the floor applies. This shift explains why the term Basel IV has gained traction. It may not represent an official regulatory break, but it clearly reflects an operational one.

Why, then, do supervisors avoid this term? Because they want to emphasise continuity. From their perspective, this is not a new philosophy but the logical completion of the post-crisis framework. Acknowledging a Basel IV would imply that Basel III was incomplete. Why, on the other hand, do banks continue to use it? Because it helps describe an internal change of paradigm. Capital steering becomes more constrained, more standardised and increasingly data-driven. The impact on pricing, capital allocation and commercial strategy is significant enough to justify a new label in everyday language.

In practice, the truth lies somewhere in between. Yes, Basel IV does not exist legally. Yes, it is the final stage of Basel III. But yes as well: for banks, the transformation is deep enough to feel like a new chapter. The gradual implementation through 2033 confirms that this is not a simple technical update. It is a long adaptation phase requiring changes in organisations, systems and management culture.

Ultimately, the real novelty is not regulatory — it is managerial. Capital becomes a resource to be actively managed in real time, just like liquidity or commercial profitability. Business decisions will increasingly need to integrate prudential considerations from the outset. Perhaps that is the best definition of what the market calls Basel IV: not a new standard, but a Basel III that has reached maturity. And as often in banking, what changes most is not the rulebook itself — but the way institutions learn to live with it.

Benoit Frayer
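As a rough numerical illustration of the output floor described in this column (a minimal sketch with hypothetical figures, not a regulatory calculation), the rule reduces to retaining the greater of the internal-model RWA and 72.5% of the standardised RWA:

```python
# Minimal sketch of the output-floor rule described above,
# with hypothetical figures for illustration only.

def floored_rwa(internal_rwa: float, standardised_rwa: float,
                floor: float = 0.725) -> float:
    """Risk-weighted assets retained for capital purposes under the output floor."""
    return max(internal_rwa, floor * standardised_rwa)

# Hypothetical portfolio: internal models yield 60, the standardised
# approach yields 100 (in billions of any currency).
print(floored_rwa(60.0, 100.0))  # -> 72.5: the floor binds, raising capital needs
print(floored_rwa(80.0, 100.0))  # -> 80.0: the internal model already exceeds the floor
```

In the first case the floor binds and the capital requirement rises above what the internal models alone would imply; in the second, the models already sit inside the common corridor and the floor is inert.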

March 11, 2026

    AFRICA, A BEACON OF MODERN MANAGEMENT

Chronicles

Jean-Jacques Pluchart

Management in post-colonial Africa has long been reduced to the management of SMEs or domestic cooperatives and the administration of subsidiaries of Western groups. Recent African literature [1] shows that the continent is going through a period of transition marked by the redeployment of sectors of activity such as agri-food, energy, construction and tourism, as well as health and telephony. These transformations are being driven by a new African elite, itself stimulated by a youth that refuses to be condemned to emigration and wishes to adapt to the changes of the contemporary world. This elite is aware that Africa is the richest of all continents in terms of its raw material resources, its youth (more than a third of the population), its social structures (more than 2,000 ethnic groups) and its cultures.

This elite, trained in European and American universities, is striving to ease the tension between the legacy of post-colonial traditions and post-modern models based on innovation, individualism and freedoms. It notes that too many African states are still victims of political instability, institutional precariousness, social inequalities and lack of funding. It is confronted with struggles between governments that are rarely democratic, but above all between local castes for the appropriation of income from the exploitation of natural resources. It also faces certain Western (less and less European), Asian (increasingly Chinese) and Russian multinational groups. African nation-states are traversed by convergent and divergent dynamics. They are striving to settle the legacy of colonisation, to consolidate their independence, to achieve full sovereignty and to influence the balance of power between the West, the East and the global South.

African youth want the emergence of an original managerial model, the advent of an "active African modernity", conceived as an adaptation of Anglo-Saxon models and as a "plural construction, combining science and conscience, but also reflecting the realities of the continent". It perceives modernity as "a historical construction aimed at freeing the individual from certain social and cultural constraints". This modernity is characterised by a loosening of traditional practices, a search for pragmatism and, above all, an appropriation of the technologies of the digital economy – in particular AI and crypto-assets – as well as by the development of large environmentally friendly infrastructure projects. It seeks new types of alliances, cooperation and investment, based on more balanced exchanges.

The priorities of the new African leaders and managers are to ensure more stable, inclusive and transparent governance, better regional integration and a stimulating dynamic for young people. The new management methods are oriented towards the creation of companies by African youth, aiming to better exploit local resources and to develop skills in the professions sought by international investors. Another priority of this cooperation is to train Africans in specialities that contribute to the achievement of the Sustainable Development Goals (SDGs), in particular SDG 1 ("No poverty"), SDG 2 ("Zero hunger"), SDG 5 ("Gender equality"), SDG 8 ("Decent work and economic growth"), SDG 10 ("Reduced inequalities and social inclusion") and SDG 13 ("Combating climate change"). The impetus for this dynamic is collaborative research and educational innovation, in which French-speaking teacher-researchers must take part.

[1] Literature commented on each week on clubturgot.com.

February 25, 2026

    The geoeconomic stakes and challenges of artificial intelligence

Chronicles

Nadia ANTONIN

Power factors have changed in the contemporary period. Recent examples of power politics reveal that economic means have become essential in the exercise of power by states. This paradigm shift is summarised by the term "geoeconomics": economic interests take precedence over political interests.

While in classical geopolitical analysis it was in territory and military means that the State found the basis of its sovereignty, geoeconomics assumes that power and security are not linked solely to the physical control of territory. Today, the objective is to acquire or maintain a dominant economic position on the international scene, in particular by mastering innovation. This is the thesis defended by the economist and historian Edward Luttwak. In the early 1990s, he analysed how the decline in the importance of military power led to a shift from geopolitics to geoeconomics, with "states and economic entities replacing the armed forces as the main actors" ("From Geopolitics to Geo-Economics: Logic of Conflict, Grammar of Commerce", The National Interest, Summer 1990).

Among the new instruments of economic and geopolitical power, artificial intelligence is today a major lever, an instrument of strategic competition and a factor in the reconfiguration of global industrial power relations, in the same way as energy, finance or armaments.

Analysis of the geoeconomics concept

For Professor Antto Vihma, geoeconomics "consists […] in shaping and managing the strategic environment in which states operate for the pursuit of their national interests by economic means" (2018). The concept of "geoeconomics" was developed in the United States by Edward Luttwak and in France by the political scientist Pascal Lorot, who created the journal Géoéconomie in 1997. It covers the study of the relationship between the economy and the political geography of a territory. A branch of geopolitics, geoeconomics is at the crossroads of economics and international relations.

Luttwak (1990) described geoeconomics as "a modern form of rivalry between international powers for which the role of the State and its financial capacities constitute the foundations of diplomacy and the 'economic' deployment of nations". As for Lorot (1999), he sees in geoeconomics "an instrument for analysing the economic and commercial strategies determined by the policies and diplomacy of national governments seeking the reciprocal international influence of these national enterprises and the State".

Bloc strategy for a dominant position

A global struggle for artificial intelligence has entered the international arena. In other words, as artificial intelligence has become a major geoeconomic issue, the major powers of the United States, China and the European Union (EU) are engaged in a technological and economic race to carve out the lion's share. Each is seeking to master the technologies, infrastructures and standards that will shape the economy of the 21st century. AI is redefining the global hierarchy: the United States dominates through innovation, China through state strategy, and the European Union through standards.

– At present, only the USA holds a monopoly on AI and has managed to rise to the rank of leader in the AI race, despite China's rise in this field. China is the main rival of the United States. US supremacy in AI innovation rests on three pillars: 1) academic excellence; 2) Silicon Valley, which remains the heart of the technology and innovation industry; 3) significant public and private funding.

– China is accelerating the integration of AI in all sectors. In addition, a major revision of the Chinese cybersecurity law (October 2025) reflects the state's desire to regulate, technically and ethically, the meteoric growth of artificial intelligence across its applications. By updating its legislative framework, China is seeking not only to assert itself on the world stage as a leader in cybersecurity, but also to better manage the risks associated with AI and infrastructure.

– The European Union has chosen the standard as a strategic and competitive weapon in order to impose its vision and conquer new markets. Wanting to become a pioneer in "a field that fascinates as much as it worries", the European Union has taken the lead over the USA and China in regulating AI. Thus, on 14 June 2023, the European Parliament approved by a large majority the draft regulation on artificial intelligence, the Artificial Intelligence Act, which entered into force in 2025.

The geoeconomic challenges of artificial intelligence

The deployment of AI raises geoeconomic issues, particularly in terms of resources, ethics, employment and regulation.

– Artificial intelligence depends on three essential resources: data (the "new black gold"), semiconductors and computing power. The battle for these resources is at the heart of current geoeconomic tensions.

– In addition, given biased algorithms, the discrimination that AI can generate, security issues, data protection and so on, the adoption of this technology raises ethical questions. The AI economy must be based on trust.

– The rise of artificial intelligence is profoundly transforming global geoeconomic balances, in particular through its effects on employment. Three major issues can be identified: 1) global competition for talent is intensifying; 2) AI is causing a recomposition of global value chains; 3) AI is destroying intermediate jobs while promoting the creation of highly skilled jobs requiring advanced technical and cognitive skills.

– With regard to regulation and digital sovereignty, countries are seeking to regulate AI to protect their economic interests and values.

In conclusion, the countries that succeed in innovating quickly, while ethically and legally regulating the uses of AI and preserving their technological sovereignty, will be the big winners of the AI revolution.

February 18, 2026

    Corporate culture, the best defence against storms

Chronicles

Business leaders have always learned to juggle the unpredictable. In a world marked today by successive crises — health, geopolitical, climate, economic — resilience is an essential quality. Whatever the size of the company, certain levers promote this resilience, and one remains largely underestimated: corporate culture.

Sometimes reduced to a few symbols or values, corporate culture is too often confined to a human resources component. Yet it is a strong dimension of the company, a set of values and behaviours capable of driving innovation and growth. When everything is faltering, it is not the tools or processes that will save the company, but these often invisible elements, which appear neither on the balance sheet nor in the profit and loss account. Corporate culture exists from the moment the entrepreneur begins the entrepreneurial adventure surrounded by his or her first employees.

Indeed, in a company where the culture is strong, clear and embodied, employees know how to behave, what matters, and what is expected of them, even when the benchmarks change. This makes it possible to react quickly, to reorganise without chaos, and to remain faithful to the company's identity without succumbing to organisational panic. Conversely, a company without a shared culture quickly becomes a field of tension: power games, inertia, strategic misalignment. In a storm, the absence of culture is paid for immediately.

So, is corporate culture a brake on innovation or growth? This is a common assumption, but one that has never been proven! Corporate culture creates a secure framework that enables strategies to be implemented. Many business leaders wonder about the causes of the failure of their strategies: the answer probably lies in the presence or absence of a corporate culture! Corporate culture is also a remedy that can keep the company from going through difficult times. In other words, there can be no innovation without a strong culture. Companies that innovate sustainably are often those where the culture values curiosity, questioning, cooperation and responsibility, not those where innovation is confined to an isolated unit or driven by sterile indicators.

We can ask ourselves about the causes of the sharp rise in business failures in our country. The lack of corporate culture could be one of them, because when the culture no longer supports the strategy, the company becomes misaligned and risks stagnating. And a ship that does not move forward moves backwards!

What if corporate culture were simply the conscience of the legal entity, a consciousness capable of perceiving and interpreting events? But this is to reckon without the cognitive biases increasingly invoked to warn us that what we think we see may be quite different from reality. Corporate culture is no exception. It is up to the entrepreneur to preserve and maintain this cognitive capital: it is vital for the business!

The world of tomorrow will remain uncertain… that is for sure! Crises will continue, in other forms. Innovation will remain vital, but difficult to sustain without benchmarks. In this context, corporate culture becomes a factor of strategic robustness. It does not protect against storms: it provides the posture to weather them, and sometimes even to emerge strengthened.

François NAUX is Managing Director of WICS Consulting, an independent director, a member of APIA and a member of the Turgot Club. Column published in the newspaper Les Echos, 1 December 2025.

February 18, 2026

    Survey of French managers who have become lecturer-researchers in management sciences

Chronicles

Jean-Jacques Pluchart

In January 2026, the FNEGE (National Foundation for Business Management Education) published a study on Practitioners Becoming Researchers (PDC), aimed at "understanding their backgrounds and motivations, supporting future candidates in this transition, and highlighting the specific strengths of these profiles for teaching and research in management sciences". The study relies on a survey carried out between January and March 2025, which collected 207 responses, including 124 from teacher-researchers who stated that they had been working for between five and twenty years. The survey reveals that a quarter of respondents work in public institutions, half in private schools and the remaining quarter abroad.

The study distinguishes three profiles of PDCs: "meaning-oriented" (the meaning of one's existence), "intellectual challenge" (facing the theoretical corpus) and "balanced" (between professional and private lives). The respondents underline the scale of the challenges PDCs must meet: research training, assimilation of the scientific literature, a drop in financial income, limited career prospects, acculturation to the (often distant) academic environment, confrontation with a certain psychological distress (due to uprooting), etc. However, they recognise that they have assets favourable to their transition: field experience, the mobilisation of professional networks for research, a better understanding of managerial issues and challenges, a less rigid hierarchy, legitimacy with students, etc. They particularly appreciate being able to carry out parallel associative, elective, sports or consulting activities. They acknowledge having underestimated the effort of retraining and the difficulty of attaining the status of university professor. Overall, they do not regret their choice of radical retraining.

The authors of the study recommend that PDCs "adopt a learning posture, accept the status of novice, develop a critical distance from their previous practice, set up companionship systems promoting the transmission of the codes, methods and postures of research, create places of collective reflexivity (seminars, peer groups) mobilising the diversity of backgrounds, organise webinars informing practitioners about the reality of the teacher-researcher profession, and promote projects enhancing the professional expertise of PDCs". The study shows that the multiplication of PDCs contributes to a better synergy between theorists and practitioners, between the university and the company, and between fundamental and applied research.

Reference: https://us02web.zoom.us/webinar/register/wn_57sf8hq_rdol5jh987mkxq

February 4, 2026

    AMERICAN STABLECOIN VERSUS DIGITAL EURO

Chronicles

Jean-Jacques Pluchart

The digitisation of currencies is currently marked by a rivalry between two models: stablecoins backed by the US dollar, and the digital euro, the central bank digital currency of the eurozone. This rivalry reflects the opposition between two fundamental systems of the economy, one mainly regulated in the short term by the market, the other governed in the long term by the State. It thus reflects a new "tragedy of the horizon", in Mark Carney's famous phrase of 2015. The digital euro system, approved by the European Council on 19 February 2025, should be in place by 2028-2029, following an institutional timetable defined by the European Central Bank, while the dollar-based stablecoin system is already in full development.

Monetary stablecoins strive to combine the flexibility of bitcoin with the stability of the fiat currencies issued by central banks. They are in principle guaranteed by solid assets, such as US Treasury bills and Fed reserves, and their transactions are secured by the blockchain. Their assets are liquid and unremunerated, so as not to compete with banking products. In principle, they perform only the functions of payment (in particular cross-border payment) and of "bridge currency" between currencies. In countries with unstable economies, they can act as an informal store of value. However, they do not fully comply with certain criteria required of systemic currencies. Unity between issuers is not guaranteed in the event of a crisis, and competition, and therefore a hierarchy, can then be established between them. The elasticity of stablecoins may be affected in the event of a monetary shock, as the system is not backed by a central bank. In the absence of a lender of last resort, cryptocurrencies are suspected of being procyclical. Due to the decentralisation of their issuers and the differences between national regulations, the integrity of the stablecoin system and the protection of its users may be threatened.

The market for monetary stablecoins also has an almost monopolistic structure, being largely dominated by the issuers Tether and Circle. Their consolidated outstanding amount is estimated at $350 billion as of 31 December 2025. But given the uncertainties weighing on their future, the capitalisation of stablecoins backed by the US currency is estimated at between $500 and… $3,700 billion by 2030. Other types of stablecoins, backed by baskets of cryptocurrencies or indexed to metal indices (such as gold) or commodities, are generally unsecured and marked by even more erratic fluctuations.

According to some economists, especially Anglo-Saxon ones, an inactive and unremunerated CBDC cannot displace the large-scale adoption of stablecoins distributed by platforms. They consider that the risk of stablecoins threatening monetary sovereignty can only arise in payment networks, because the value of a national or federal currency is stabilised by the balance sheet of the central bank, and convertibility between deposits is guaranteed by banking regulations. According to most monetary economists, the use of dollar-backed stablecoins is still limited (11% of global monetary assets at the end of 2025), because they face direct competition from bank savings products and, above all, because they can indirectly suffer from the erosion of confidence in major currencies such as the US dollar. Monetary stablecoins therefore risk suffering the fate of certain now-vanished private currencies that have marked world monetary history.

January 28, 2026

    A new consumer credit reform to combat over-indebtedness

Chronicles

Nadia Antonin

The Banque de France is sounding the alarm: according to the latest monthly barometer of financial inclusion, published on 11 December 2025, the number of over-indebtedness files filed with the Banque de France increased by 8.9% over the first eleven months of 2025 compared with the same period in 2024. This increase intensified in November 2025, with a rise of 12.8% compared with November 2024. In addition, according to the Banque de France's 2024 typological survey published in February 2025, nearly one in two over-indebtedness files (43%) includes at least one consumer loan.

The rise of consumer credit

In view of technological developments and changes in consumer habits, consumer credit has developed strongly in recent years. It is an essential lever for French households, allowing them to finance personal projects such as a car, a trip or household appliances, or to overcome cash-flow difficulties. It also represents a strategic activity for banking and financial organisations.

Consumer credit is credit granted to individuals for the purchase of a manufactured good, as opposed to production credit granted to the manufacturer. It concerns transactions other than those related to real estate. Its amount is between €200 and €75,000 and its repayment period is greater than three months. It is governed by the Consumer Code, which lays down a set of rules relating to the content and conclusion of the contract.

There are several forms of consumer credit:
– the personal loan, which makes it possible to finance a project without having to justify the use of the funds;
– the assigned credit, linked to the purchase of a specific good or service;
– revolving credit, consisting of a credit line, accessible at any time, which is replenished as repayments are made;
– the lease with purchase option, which makes it possible to rent a good (for non-professional use) with the option of buying it at the end of the contract at a price fixed from the start;
– personal microcredit, for people who do not have access to traditional bank loans because of low income or financial precariousness.

Consumer credit and over-indebtedness

Attractive as it appears, consumer credit can quickly become a financial trap for those who do not understand how it works. It is considered to favour over-indebtedness, that is, a situation in which a person (or a household) is no longer able to honour their debts (monthly loan payments, bank overdrafts, rents, etc.).

Over-indebtedness developed in the late 1980s following the lifting of credit controls, in a context of strong growth in consumer credit. Despite the passing of the Neiertz law in December 1989 to combat over-indebtedness, the number of filings continued to grow. Faced with the growing flow of cases, the government adopted the Lagarde law of 1 July 2010, which imposes strict regulations on consumer credit for amounts between €200 and €75,000 with a minimum duration of three months.

To circumvent the Lagarde law, two new types of loans not subject to the provisions of the Consumer Code appeared: the "split payment", which consists of paying for a purchase in 3 or 4 instalments, and the "mini-credit", a loan of a few hundred euros. These rapid payment facilities, often perceived as harmless, have led more and more French people into over-indebtedness. In 2024, 17% of over-indebtedness files contained a mini-credit or a split payment, compared with only 1% in 2022 and 7% in 2023 (source: Observatory of Banking Inclusion). This sharp increase in over-indebtedness prompted the French government to transpose, by order of 3 September 2025, European Directive (EU) 2023/2225 of 18 October 2023 on consumer credit agreements.

Tightening of conditions for access to consumer credit

This new reform, due to come into force on 20 November 2026, aims to regulate access to credit more strictly and to strengthen the protection of borrowers. The new provisions are as follows:
– an extension of the scope of consumer credit regulations: until now, only transactions between €200 and €75,000 were regulated. The new rules will apply from the first euro up to €100,000, and will bring split payments, mini-credits, bank overdrafts and leases with option to purchase within the scope (see the sketch after this list);
– a careful study of the borrower's solvency and a consultation of the Banque de France's Personal Loan Repayment Incidents File (FICP) before any signing of the contract;
– highly regulated advertising, which must be "clear, fair and not misleading";
– a strengthening of information obligations before signing the contract: the standardised pre-contractual information form (FIPEN) given to the borrower will have to describe precisely the characteristics of the contract;
– support for borrowers experiencing financial difficulties: the lender will have to assist them by offering solutions such as renegotiation, rescheduling, spreading or extending the duration of the loan, and by directing them towards advisory services;
– a new sanctioning power: control and sanctions will now be entrusted to the Directorate General for Competition, Consumer Affairs and Fraud Control (DGCCRF).

This new reform of consumer credit to combat over-indebtedness thoroughly modernises the provisions applicable in France in order to protect borrowers and harmonise the rules at European level. It introduces a requirement of transparency, ethics and accountability.
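To make the widened scope concrete, here is a minimal sketch (a simplified, hypothetical helper, not part of the legal text; real applicability also depends on duration and product type) contrasting the old and new amount thresholds:

```python
# Simplified, hypothetical helper contrasting the scope rules described
# above: the Lagarde framework covered credits between EUR 200 and
# EUR 75,000; the 2026 reform applies from the first euro up to EUR 100,000.

def in_regulated_scope(amount_eur: float, reform_2026: bool) -> bool:
    """Whether a consumer credit of this amount falls under the regulation."""
    if reform_2026:
        # New regime: from the first euro up to EUR 100,000.
        return 0 < amount_eur <= 100_000
    # Lagarde regime: between EUR 200 and EUR 75,000.
    return 200 <= amount_eur <= 75_000

print(in_regulated_scope(150, reform_2026=False))  # False: a EUR 150 mini-credit escaped the old rules
print(in_regulated_scope(150, reform_2026=True))   # True: now covered from the first euro
```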

January 28, 2026

