Functional Programming in Go

A brief history of functional programming

If you take a look at the mainstream languages of the past decade, you will notice that the prevailing programming paradigm is OOP. This might lead you to believe that FP is an upstart paradigm, one still young compared to the well-established object-oriented approach. Yet when we look at the history of FP, we can trace its roots all the way back to the 1930s, quite some time before programming existed in the modern sense.

The roots of FP can be traced back to the Lambda calculus, developed in the 1930s by Alonzo Church as a formal system for expressing computation based on function abstraction and application, using variable binding. There are two variants of this calculus: typed and untyped. This parallels how programming languages today are either statically typed, such as Java and Go, or dynamically typed, such as Python. The Lambda calculus was proven to be Turing-complete in 1937 – again, similar to how all mainstream programming languages today are Turing-complete.

The Lambda calculus thus predates modern programming by a few decades. To get to the first actual code that could be thought of as FP, in the way that we understand programming today, we have to move forward to the 1950s, when the LISt Processor (LISP) was created as a practical mathematical notation for computer programs, influenced by the Lambda calculus that Church had laid out in the 1930s.

LISP can be thought of as the first FP language to reach some degree of popularity. It was especially popular in the field of artificial intelligence research but, over the decades, it also made its way into industry. Derivatives of LISP remained popular for a long time, with notable software such as the game Crash Bandicoot and the Hacker News website written in its dialects.

LISP was developed in the late 1950s by John McCarthy. To define LISP functions, he took inspiration from the Lambda calculus developed by Church, extending it beyond the mathematical system by introducing recursion, a fundamental concept in how functional languages work. Beyond recursion, LISP also treated functions as first-class citizens and pushed innovation in programming language design by including features such as garbage collection and conditional expressions.

In the early 1960s, Kenneth E. Iverson developed A Programming Language (APL). APL is, again, a functional language, perhaps best known for its use of symbols and its terse code. For example, the following image shows a code snippet that generates Conway’s Game of Life:

Figure 1.2: Conway’s Game of Life in APL

Jumping ahead another decade, in 1973, we get a language called Meta Language (ML). This language introduced the polymorphic Hindley-Milner type system – that is to say, a type system in which types are inferred automatically without requiring explicit type annotations. In addition, it supported features such as function currying, which we will apply to our functional Go code later in this book. It also supported pattern matching on the arguments of a function, as we can see in the following snippet of a function that computes the factorial of a number:

fun fac 0 = 1
  | fac n = n * fac (n - 1)

The pattern matcher in this example looks at the input value passed to the fac function and then continues with the first line if the input value is 0, or with the second line in all other cases. Notice that this is also a recursive function, expressed quite elegantly. Sadly, pattern matching will not be explored further in this book, as Go currently offers no way of doing this. We will, however, see a way of achieving a similar kind of function dispatch using maps and higher-order functions, as sketched below.
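
As a preview of that idea, here is a minimal Go sketch: a map from input values to handler functions stands in for the base-case clause, and an ordinary recursive call covers every other input. The baseCases and fac names are purely illustrative and are not taken from later chapters:

package main

import "fmt"

// baseCases maps specific input values to handler functions, loosely
// mirroring the clauses of the ML pattern match above.
var baseCases = map[int]func() int{
    0: func() int { return 1 }, // corresponds to: fac 0 = 1
}

// fac consults the map first and otherwise falls back to the recursive
// case, corresponding to: fac n = n * fac (n - 1)
func fac(n int) int {
    if handler, ok := baseCases[n]; ok {
        return handler()
    }
    return n * fac(n-1)
}

func main() {
    fmt.Println(fac(5)) // prints 120
}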

In 1977, a language simply called FP was created by John Backus, specifically to support the FP paradigm. While the language itself did not get much traction outside of academia, the paper in which it was introduced (Can Programming Be Liberated from the von Neumann Style?) did spark a renewed interest in FP.

In the same decade as ML and FP, another language, called Scheme, was developed. This was the first dialect of LISP to use lexical scoping and tail-call optimization, which made the practical implementation of recursive algorithms viable. While the details of tail-call optimization will be discussed in Chapter 7 of this book, briefly stated, it allows recursive algorithms to be implemented efficiently, without using more memory than a traditional loop would, thus eliminating the “stack overflow” errors that would otherwise occur during deep recursion.
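
As a rough illustration only, the following Go sketch shows what a call in tail position looks like. The sumTo function and its accumulator parameter are invented for this example, and note that the Go compiler does not currently perform tail-call optimization, so the memory benefit described above applies to languages such as Scheme rather than to this snippet:

package main

import "fmt"

// sumTo adds the numbers from n down to 1, carrying the running total in acc.
// The recursive call is in tail position: it is the last thing the function
// does, so a compiler that performs tail-call optimization (such as a Scheme
// implementation) can reuse the current stack frame instead of growing the
// stack on every call.
func sumTo(n, acc int) int {
    if n == 0 {
        return acc
    }
    return sumTo(n-1, acc+n)
}

func main() {
    fmt.Println(sumTo(10, 0)) // prints 55
}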

Scheme is one of the most influential LISP dialects ever created and remains somewhat popular to this day. Although it was created in 1975, the most recent standard was defined as recently as 2013 (R7RS-Small). Scheme, in turn, influenced other LISP dialects, the most notable of which is perhaps Common Lisp. Interestingly, despite its roots in FP, Common Lisp introduced the Common Lisp Object System (CLOS), which facilitated OOP in LISP. With this, we can perhaps consider LISP a truly multi-paradigm language, not unlike Go.

The final language to look at before we jump to contemporary functional languages is Miranda. Miranda is a lazy, purely functional language. The key concept it introduced is lazy evaluation. When a language is said to support lazy evaluation, this means that an expression is not evaluated until its value is actually needed. This can be used, for example, to implement infinite data structures. You could define a function that generates all Fibonacci numbers, a sequence that never ends – but, rather than creating the entire list (which is not possible), it will only generate the subset of that list that is relevant to the problem you are solving. For example, the following snippet of Miranda code defines the list of all square numbers:

squares = [ n*n | n <- [0..] ]
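
To give a flavor of the same idea in Go terms, here is a small sketch that produces square numbers on demand using a closure-based generator; the squares function is invented for this illustration and only approximates true lazy evaluation:

package main

import "fmt"

// squares returns a generator: each call to the returned closure produces the
// next square number on demand, so the "infinite list" is never built up
// front. This captures the idea of computing values only when they are
// actually needed.
func squares() func() int {
    n := 0
    return func() int {
        sq := n * n
        n++
        return sq
    }
}

func main() {
    next := squares()
    for i := 0; i < 5; i++ {
        fmt.Println(next()) // prints 0, 1, 4, 9, 16
    }
}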

With that, we have arrived at the next language to discuss briefly, namely Haskell.
