BabelCamera


Find out how to describe the things around you in another language!

An iOS app that uses Core ML and the Vision framework in iOS 11, together with the Google Translate API.

Demo

[BabelCamera demo gif]

The app also reads the translated words aloud.
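
For reference, speaking a translated word on iOS can be done with AVSpeechSynthesizer. The snippet below is a minimal sketch of that idea, not necessarily the app's exact code; the word "chat" and the fr-FR voice are illustrative values:

import AVFoundation

// Minimal sketch: pronounce a translated word with an iOS voice.
// "chat" and "fr-FR" are illustrative values only.
let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "chat")
utterance.voice = AVSpeechSynthesisVoice(language: "fr-FR")
synthesizer.speak(utterance)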

Running

  1. Get a Google Translate API Key
  2. Copy Keys.example.xcconfig to Keys.xcconfig and replace YOUR_API_KEY_HERE in the file with your API key from step 1 (see the sketch after this list)
  3. Clean build folder / clear derived data (⌥⇧⌘K)
  4. Run the app
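
After step 2, Keys.xcconfig holds your key as a build setting. A minimal sketch of what the file might look like (the variable name here is illustrative; keep whatever name Keys.example.xcconfig actually defines):

// Keys.xcconfig (illustrative variable name; match Keys.example.xcconfig)
GOOGLE_TRANSLATE_API_KEY = your-api-key-from-step-1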

Vision model

The app currently comes with SqueezeNet. You can replace it with another model from Apple's Core ML page (e.g. ResNet50, Inception v3, VGG16) by dragging its .mlmodel file (e.g. VGG16.mlmodel) into Xcode where you see SqueezeNet.mlmodel, and then:

In VisionService.swift, replace the line:

model = try? VNCoreMLModel(for: SqueezeNet().model)

with

model = try? VNCoreMLModel(for: VGG16().model)

or the model of your choice
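
For context, here is a rough sketch of how such a Core ML model plugs into a Vision classification request. Only the VNCoreMLModel line above is taken from this project; the class and method names below are illustrative and may not match VisionService.swift exactly:

import CoreGraphics
import Vision

// Rough sketch of a Vision + Core ML classification pass (illustrative, not the app's exact code).
final class ExampleVisionService {
    // Swap SqueezeNet() for VGG16() (or another generated model class) after adding its .mlmodel file.
    private let model = try? VNCoreMLModel(for: SqueezeNet().model)

    func classify(_ image: CGImage, completion: @escaping (String?) -> Void) {
        guard let model = model else { return completion(nil) }
        let request = VNCoreMLRequest(model: model) { request, _ in
            // The top label's identifier is what would get translated and spoken.
            let top = (request.results as? [VNClassificationObservation])?.first
            completion(top?.identifier)
        }
        let handler = VNImageRequestHandler(cgImage: image, options: [:])
        try? handler.perform([request])
    }
}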

Languages currently supported

These are the languages that can both be translated by Google Translate and pronounced by iOS, as listed in Language.swift (see the sketch after this list):

  • Arabic
  • Chinese (Simplified)
  • Chinese (Traditional)
  • Czech
  • Danish
  • Dutch
  • English
  • Finnish
  • French
  • German
  • Greek
  • Hebrew
  • Hindi
  • Hungarian
  • Indonesian
  • Italian
  • Japanese
  • Korean
  • Norwegian
  • Polish
  • Portuguese
  • Romanian
  • Russian
  • Slovak
  • Spanish
  • Swedish
  • Thai
  • Turkish
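
As an illustration only (the real Language.swift may be organized differently), each supported language essentially pairs a Google Translate target code with a locale that iOS speech synthesis can pronounce. The enum below sketches that idea with a few cases and made-up names:

// Illustrative sketch; the actual Language.swift in this repo may differ.
enum ExampleLanguage {
    case french, japanese, korean // ...plus the rest of the list above

    // Value sent as the `target` parameter to the Google Translate API.
    var translationCode: String {
        switch self {
        case .french: return "fr"
        case .japanese: return "ja"
        case .korean: return "ko"
        }
    }

    // Locale passed to AVSpeechSynthesisVoice(language:) when the word is spoken.
    var speechCode: String {
        switch self {
        case .french: return "fr-FR"
        case .japanese: return "ja-JP"
        case .korean: return "ko-KR"
        }
    }
}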
