Data Engineering with Scala and Spark

You're reading from Data Engineering with Scala and Spark: Build streaming and batch pipelines that process massive amounts of data using Scala

Product type: Paperback
Published in: Jan 2024
Publisher: Packt
ISBN-13: 9781804612583
Length: 300 pages
Edition: 1st Edition
Authors (3): Rupam Bhattacharjee, David Radford, Eric Tome
Table of Contents (21)

Preface
1. Part 1 – Introduction to Data Engineering, Scala, and an Environment Setup
2. Chapter 1: Scala Essentials for Data Engineers
3. Chapter 2: Environment Setup
4. Part 2 – Data Ingestion, Transformation, Cleansing, and Profiling Using Scala and Spark
5. Chapter 3: An Introduction to Apache Spark and Its APIs – DataFrame, Dataset, and Spark SQL
6. Chapter 4: Working with Databases
7. Chapter 5: Object Stores and Data Lakes
8. Chapter 6: Understanding Data Transformation
9. Chapter 7: Data Profiling and Data Quality
10. Part 3 – Software Engineering Best Practices for Data Engineering in Scala
11. Chapter 8: Test-Driven Development, Code Health, and Maintainability
12. Chapter 9: CI/CD with GitHub
13. Part 4 – Productionalizing Data Engineering Pipelines – Orchestration and Tuning
14. Chapter 10: Data Pipeline Orchestration
15. Chapter 11: Performance Tuning
16. Part 5 – End-to-End Data Pipelines
17. Chapter 12: Building Batch Pipelines Using Spark and Scala
18. Chapter 13: Building Streaming Pipelines Using Spark and Scala
19. Index
20. Other Books You May Enjoy

Variance

As mentioned earlier, functions are first-class objects in Scala. Scala automatically converts function literals into objects of the FunctionN type (N = 0 to 22). For example, consider the following anonymous function:

val f: Int => Any = (x: Int) => x

Example 1.45

This function will be converted automatically to the following:

val f = new Function1[Int, Any] {def apply(x: Int) = x}

Example 1.46

Please note that the preceding syntax represents an object of an anonymous class that extends Function1[Int, Any] and implements its abstract apply method. In other words, it is equivalent to the following:

class AnonymousClass extends Function1[Int, Any] {
  def apply(x: Int): Any = x
}
val f = new AnonymousClass

Example 1.47
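To confirm the equivalence, both forms respond identically when applied to an argument (a quick sketch; the value names here are ours, not the book's):

```scala
// The function literal and the explicit Function1 instance are interchangeable
val fLiteral: Int => Any = (x: Int) => x
val fExplicit = new Function1[Int, Any] { def apply(x: Int): Any = x }

// f(3) is syntactic sugar for f.apply(3) in both cases
println(fLiteral(3))  // 3
println(fExplicit(3)) // 3
```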

If we refer to the type signature of the Function1 trait, we see the following:

Function1[-T1, +T2]

Example 1.48

T1 represents the argument type and T2 represents the return type. T1 is contravariant and T2 is covariant. In general, covariance, denoted by +, means that if a class or trait is covariant in its type parameter T, that is, C[+T], then C[T1] and C[T2] follow the subtyping relationship between T1 and T2. For example, since Any is a supertype of Int, C[Any] will be a supertype of C[Int].
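A minimal sketch of covariance, using a hypothetical Box class (not from the chapter):

```scala
// Box is covariant in T, so Box[Int] is a subtype of Box[Any]
class Box[+T](val value: T)

val intBox: Box[Int] = new Box(42)
val anyBox: Box[Any] = intBox // compiles because Any is a supertype of Int
```

Note that the reverse assignment, `val b: Box[Int] = new Box[Any]("x")`, would not compile.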

The order is reversed for contravariance. So, if we have C[-T], then C[Int] will be a supertype of C[Any].
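The reversal can be sketched with a hypothetical contravariant trait (again, Printer is our own illustration):

```scala
// Printer is contravariant in T, so Printer[Any] is a subtype of Printer[Int]
trait Printer[-T] { def show(t: T): String }

val anyPrinter: Printer[Any] = new Printer[Any] {
  def show(t: Any): String = t.toString
}
val intPrinter: Printer[Int] = anyPrinter // compiles: the order is reversed
```

Intuitively, a printer that can handle Any can certainly handle an Int, which is why the assignment is safe.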

Since we have Function1[-T1, +T2], that would then mean type Function1[Int, Any] will be a supertype of, say, Function1[Any, String].
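This subtyping relationship can be checked directly with an assignment (a small sketch; the value names are ours):

```scala
// Any => String can be used wherever Int => Any is expected,
// thanks to Function1's variance annotations
val g: Any => String = (x: Any) => x.toString
val h: Int => Any = g // compiles: Function1[Any, String] <: Function1[Int, Any]
```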

To see it in action, let’s define a method that takes a function of type Int => Any and returns Unit:

def caller(op: Int => Any): Unit = List
  .tabulate(5)(i => i + 1)
  .map(op)
  .foreach(i => print(s"$i "))

Example 1.49

Let’s now define two functions:

scala> val f1: Int => Any = (x: Int) => x
f1: Int => Any = $Lambda$9151/1234201645@34f561c8
scala> val f2 : Any => String = (x: Any) => x.toString
f2: Any => String = $Lambda$9152/1734317897@699fe6f6

Example 1.50

A function (or method) with a parameter of type T can be invoked with an argument that is either of type T or its subtype. And since Int => Any is a supertype of Any => String, we should be able to pass both of these functions as arguments. As can be seen, both of them indeed work:

scala> caller(f1)
1 2 3 4 5
scala> caller(f2)
1 2 3 4 5

Example 1.51
