
What made me share my work

  • Ever since I started using Spotify on a day-to-day basis, I’ve wanted to re-create something from the app. I’m a big fan of finding out what the writer wanted to say through their work. That’s why the way Spotify embeds a song’s lyrics while you listen to it really caught my attention.
  • I managed to implement the animation and then forgot I had even done it in the first place.
  • While “wasting time” on Twitter, I found a tweet that made me think.

This tweet made me search my trash, downloads and documents folders in the hope of finding something useless rather than dumb. I found the so-called SpotifyLyrics project, which showcases this animation:

Spotify app screenshot showing the switch from album cover to lyrics.

How I re-created this animation through coding

To get started, let me break down the animation into small pieces and explain how I implemented it.

Initial state

Square screenshot of Spotify lyrics explainer box. The title says: Behind the lyrics. There's a lightbulb as a separator, followed by the text: "As far back as 2015, Andrei used the term 'sicko' to '6icko' to refer to his ambitious work ethic". The footer has the source marked as Genius.

source: Spotify

The animation needs two views, the aboveView and the belowView. We notice that the aboveView always stays in place, while the belowView sits a little higher and is a little bit smaller.

The aboveView property will always return the view that is closer to our eyes, based on the isFirstViewAbove flag, while the belowView will return the view further from our eyes.

class PerspectiveView: UIView {
    private let firstView: UIView
    private let secondView: UIView

    // This will always return the view closer to our eyes
    private var aboveView: UIView {
        return isFirstViewAbove ? firstView : secondView
    }

    // This will always return the view further from our eyes
    private var belowView: UIView {
        return isFirstViewAbove ? secondView : firstView
    }

    /// The initial "state"
    private var isFirstViewAbove = true

    // MARK: - Lifecycle

    init(frontView: UIView, behindView: UIView) {
        self.firstView = frontView
        self.secondView = behindView
        super.init(frame: .zero)
        setupView()
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    private func setupView() {
        backgroundColor = .clear
        addSubview(firstView)
        firstView.snap(to: self)
        addSubview(secondView)
        secondView.snap(to: self)
        setupInitialAnimationState()
    }
}
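
The snap(to:) call is a small Auto Layout helper that isn't shown in the post. Here is a minimal sketch of what it might look like; the helper name comes from the snippet above, but the body is my assumption:

import UIKit

extension UIView {
    // Pins the receiver's edges to the given view using Auto Layout
    func snap(to view: UIView) {
        translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([
            topAnchor.constraint(equalTo: view.topAnchor),
            leadingAnchor.constraint(equalTo: view.leadingAnchor),
            trailingAnchor.constraint(equalTo: view.trailingAnchor),
            bottomAnchor.constraint(equalTo: view.bottomAnchor)
        ])
    }
}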

We can achieve the initial state with two default CGAffineTransforms: one for scaling and one for translation (moving the view up by an arbitrary value of -30).


extension CGAffineTransform {
    static let scale = CGAffineTransform(scaleX: 0.9, y: 0.9)
    static let translation = CGAffineTransform(translationX: 0, y: -30)
}

private extension CGFloat {
    static let below: CGFloat = 1
    static let above: CGFloat = 2
}

Now, all we have to do is implement the setupInitialAnimationState function:

private func setupInitialAnimationState() {
    firstView.layer.zPosition = .above
    secondView.layer.zPosition = .below
    secondView.transform = CGAffineTransform.scale.concatenating(.translation)
}

We need to modify each view’s layer zPosition.

  • It is a value in the range [-greatestFiniteMagnitude, greatestFiniteMagnitude] that changes the front-to-back ordering of the onscreen layers, so a higher value places the layer closer to our eyes.

The firstView (initially the one in front) gets zPosition = .above, which is 2, while the secondView gets zPosition = .below, which is 1, and has the scaling and translation transforms concatenated and applied to it.

The user starts dragging

We will need a UIPanGestureRecognizer so that we can manipulate and move the view.

...
private lazy var panGesture = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
...

@objc private func handlePan(_ sender: UIPanGestureRecognizer) {
    // How much did the user swipe
    let translation = sender.translation(in: self)
    switch sender.state {
    case .began, .changed:
        break
    case .ended:
        break
    default:
        return
    }
}
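
Note that the snippets never show the recognizer being attached to the view; presumably a single extra line in setupView takes care of that (my assumption, not spelled out in the post):

// Inside setupView(), alongside adding the subviews
addGestureRecognizer(panGesture)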

Here we notice there are four cases to handle:

  1. The alpha starts fading only after a certain amount of drag; we'll say only once abs(translation.y) is greater than 20.
    Spotify app player screenshot, with the album cover moving downwards and the lyrics starting to become visible underneath.
    Add this to .began, .changed cases:
    ...
    case .began, .changed:
        let alphaFadeOffset: CGFloat = 20
        let alphaPercent = 1 - (abs(translation.y) - alphaFadeOffset) / 100
        aboveView.transform = CGAffineTransform(translationX: 0, y: translation.y)
        aboveView.alpha = abs(translation.y) > alphaFadeOffset ? alphaPercent : 1.0
    ...
    Now we have our aboveView fading its alpha and moving based on translation.y (how far the user dragged on the y-axis), but the view doesn't return to its initial position when the pan gesture ends.
  2. When the user stops dragging, we should make the aboveView return to its initial position.
    Screenshot of Spotify player with the album cover moving downwards and jumping back in place.
    We just apply a transform = .identity on the aboveView and animate its alpha back to 1.
    case .ended:
        handlePanEnded()
    ...

    private func handlePanEnded() {
        UIView.animate(withDuration: 0.33,
                       delay: 0.0,
                       usingSpringWithDamping: 0.7,
                       initialSpringVelocity: 0.7,
                       options: .curveEaseOut,
                       animations: {
                           self.aboveView.transform = .identity
                           self.aboveView.alpha = 1.0
                       }, completion: nil)
    }
  3. The user drags down and, after a dismissalOffset (we can choose an arbitrary value of 80), the aboveView changes its zPosition and has the scaling and translation transforms concatenated and applied to it.
    Screenshot of the Spotify app player, where the album cover is dragged down and fades to black under the lyrics text box. You can still see the cover peeking out behind the lyrics card at the top.
  4. The user drags up and, after a dismissalOffset, the aboveView changes its zPosition. In the Spotify app it then comes back in from the bottom of the screen to its initial state. I find this kind of weird and it really makes no sense to me, so I will implement what I consider to be better UX: the view comes back naturally from wherever the user dragged past the dismissalOffset.
    Screenshot of the Spotify player. The album cover slides upwards and fades out, the lyrics card comes up front, and the album cover comes in from below and hides behind the lyrics card.

    After we animate the alpha in .began, .changed cases, add this:

    // dismissalOffset is a constant stored on the view
    private let dismissalOffset: CGFloat = 80

    case .began, .changed:
        ...
        let currentOffset = abs(translation.y)
        guard currentOffset > dismissalOffset else { return }
        panGesture.isEnabled = false
        handleTransition()

    As you may notice, we disable the panGesture once currentOffset > dismissalOffset, so the .ended case won't get called.

    All that’s left to do is to implement the handleTransition method:

    private func handleTransition() {
        UIView.animate(withDuration: 0.33, animations: {
            self.aboveView.alpha = 1.0
            self.aboveView.layer.zPosition = .below
            self.aboveView.transform = CGAffineTransform.scale.concatenating(.translation)
            self.belowView.layer.zPosition = .above
            self.belowView.transform = .identity
        }, completion: { _ in
            self.isFirstViewAbove.toggle()
            self.panGesture.isEnabled = true
        })
    }
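
Putting the four cases together, the full handlePan(_:) might look roughly like the sketch below. It assumes the properties introduced above (aboveView, panGesture, dismissalOffset) and may differ slightly from the version in the repo:

@objc private func handlePan(_ sender: UIPanGestureRecognizer) {
    let translation = sender.translation(in: self)
    switch sender.state {
    case .began, .changed:
        // Move the above view with the finger and fade it after a 20pt offset
        let alphaFadeOffset: CGFloat = 20
        let alphaPercent = 1 - (abs(translation.y) - alphaFadeOffset) / 100
        aboveView.transform = CGAffineTransform(translationX: 0, y: translation.y)
        aboveView.alpha = abs(translation.y) > alphaFadeOffset ? alphaPercent : 1.0
        // Once the drag passes the dismissal offset, hand over to the transition
        let currentOffset = abs(translation.y)
        guard currentOffset > dismissalOffset else { return }
        panGesture.isEnabled = false
        handleTransition()
    case .ended:
        handlePanEnded()
    default:
        return
    }
}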

AboveView

  • we animate the alpha back to 1
  • we change its zPosition to .below
  • and concatenate the scale and the translation transforms

BelowView

  • we change its zPosition to .above
  • and transform it to .identity

In the completion block we toggle() the isFirstViewAbove flag and re-enable the panGesture, and our animation is done.
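
For context, here is one hypothetical way the finished component could be embedded in a view controller; coverView and lyricsView are made-up placeholders for the album art and the lyrics card:

import UIKit

class PlayerViewController: UIViewController {
    // Made-up placeholder subviews standing in for the album art and the lyrics card
    private let coverView = UIImageView()
    private let lyricsView = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()
        let perspectiveView = PerspectiveView(frontView: coverView, behindView: lyricsView)
        view.addSubview(perspectiveView)
        perspectiveView.snap(to: view)
    }
}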

Final result

Screenshot of Spotify app, specifically of the player screen. The album cover moves downwards and fades, revealing the lyrics on a black card. Then the album cover moves back to hide the lyrics.

Trying to re-create an animation has always been on my to-do list.

Thanks to ajlkn and his tweet I’ve managed to break the ice between me and my first Medium post.

If you have any questions or want to give me feedback, I’m always available on Twitter. If you’d like to see more of my work, check out my website and GitHub.


Bonus

The full source code is available on GitHub, supporting drag on the yAxis, the xAxis, or both.
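
As a rough idea of how that option could be modeled (purely an assumption on my part; the real API in the repo may look different), the drag axis could be a small configuration value passed to the view:

// Hypothetical configuration; the repo's actual API may differ
enum DragAxis {
    case vertical
    case horizontal
    case both
}

// e.g. something like: PerspectiveView(frontView: coverView, behindView: lyricsView, dragAxis: .both)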