Implementing logistic regression in Swift
The most important differences between this implementation and multiple linear regression are the following:
- Normalization is required only for the feature matrix x, not for the target vector y, because the output already lies in the range (0, 1)
- The hypothesis is different
- The cost function looks different, but the cost gradient keeps the same form (both the hypothesis and the cost are written out right after this list)
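For reference, here is the standard textbook formulation of these quantities, with \theta playing the role of the weights vector of the class below, m the number of training samples, and the superscript (i) indexing samples:

h_\theta(x) = \sigma(\theta^\top x) = \frac{1}{1 + e^{-\theta^\top x}}

J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta(x^{(i)}) + \left(1 - y^{(i)}\right) \log\left(1 - h_\theta(x^{(i)})\right) \right]

\frac{\partial J(\theta)}{\partial \theta_j} = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}

The last expression has exactly the same form as the gradient in linear regression; only the hypothesis inside it is replaced by the sigmoid.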
Again, we'll need some Accelerate functions:
import Accelerate
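To give an idea of which Accelerate functions come into play, here is one possible way to compute the element-wise sigmoid 1 / (1 + exp(-z)) for a whole vector. This is a sketch rather than the exact implementation used later:

// Element-wise sigmoid of a vector, relying on the Accelerate import above.
func sigmoid(_ z: [Double]) -> [Double] {
    let n = vDSP_Length(z.count)
    // minusZ = -z
    var minusZ = [Double](repeating: 0.0, count: z.count)
    vDSP_vnegD(z, 1, &minusZ, 1, n)
    // expMinusZ = exp(-z), using the vForce part of Accelerate
    var expMinusZ = [Double](repeating: 0.0, count: z.count)
    var count = Int32(z.count)
    vvexp(&expMinusZ, minusZ, &count)
    // result = 1 / (1 + exp(-z))
    var one = 1.0
    var denominator = [Double](repeating: 0.0, count: z.count)
    vDSP_vsaddD(expMinusZ, 1, &one, &denominator, 1, n)
    var result = [Double](repeating: 0.0, count: z.count)
    vDSP_svdivD(&one, denominator, 1, &result, 1, n)
    return result
}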
The logistic regression class definition looks similar to the one for multiple linear regression:
public class LogisticRegression {
    // Learned weight vector; filled in during training.
    public var weights: [Double]!

    public init(normalization: Bool) {
        self.normalization = normalization
    }

    // Whether features are normalized before training and prediction.
    private(set) var normalization: Bool
    // Per-feature mean and standard deviation of the training features,
    // stored so that prediction inputs can be normalized the same way.
    private(set) var xMeanVec = [Double]()
    private(set) var xStdVec = [Double]()

    // (The prediction methods shown below are also part of this class.)
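As a quick usage illustration (a hypothetical example, not part of the original listing):

let model = LogisticRegression(normalization: true)
// After training, model.weights holds the learned parameters, and
// model.xMeanVec / model.xStdVec hold the statistics used to normalize inputs.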
The prediction part of logistic regression
This is the code that implements the hypothesis for a single input sample and for a matrix of inputs:
public func predict(xVec: [Double]) -> Double {
    if normalization {
        let input...
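Since the listing is cut off here, the following is a sketch of how both prediction methods could be completed. It assumes that the bias term is stored in weights[0], that inputs are normalized with the stored mean and standard deviation vectors, and that the hypothesis is the sigmoid of the dot product between the extended input and the weights; the actual implementation may differ in details such as helper functions.

// Continuing inside the LogisticRegression class.
public func predict(xVec: [Double]) -> Double {
    var input = xVec
    if normalization {
        // (x - mean) / std, element-wise, using the stored training statistics
        var centered = [Double](repeating: 0.0, count: input.count)
        vDSP_vsubD(xMeanVec, 1, input, 1, &centered, 1, vDSP_Length(input.count))
        vDSP_vdivD(xStdVec, 1, centered, 1, &input, 1, vDSP_Length(input.count))
    }
    // Prepend 1.0 for the bias term and compute the linear combination z = w · [1, x].
    let extended = [1.0] + input
    let w: [Double] = weights
    var z = 0.0
    vDSP_dotprD(extended, 1, w, 1, &z, vDSP_Length(extended.count))
    // Sigmoid of the linear combination; exp comes from the Darwin/Foundation math functions.
    return 1.0 / (1.0 + exp(-z))
}

// The matrix version applies the same hypothesis to every row.
public func predict(xMat: [[Double]]) -> [Double] {
    return xMat.map { predict(xVec: $0) }
}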