Shared Model on Mac and iPhone

I'm currently looking into unifying the model of my application. At the moment I'm using a different model on the Mac and on the iPhone; missing classes (NSAttributedString) and missing technologies (Bindings) made me go with that decision.

With the first restriction gone in SDK 3.2, and since I also plan to create an optimized iPad version, I'm reconsidering that decision. As I also need to store NSPoint/CGPoint, NSRect/CGRect, NSColor/UIColor and NSImage/UIImage structs/objects in my model, I'm not sure what the best way to handle them would be.

Writing my own MNColor class that encapsulates NSColor or UIColor depending on the platform? Writing my own rect functions that call the appropriate function depending on the platform? Or using CGRect in the model on the Mac as well?

Any input would be very appreciated!


CorePlot is an excellent Cocoa plotting framework for iPhone and Mac OS X.
It shares common code across both platforms, so maybe you can get some ideas by browsing its source.

For some cross-platform issues, CP uses platform-specific defines in separate header files to get an "agnostic" image class.
The Xcode project for each platform includes the corresponding header (a rough sketch of the pattern follows the list):

  • MacOnly/CPPlatformSpecificDefines.h
  • iPhoneOnly/CPPlatformSpecificDefines.h
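
To make that concrete, here is a minimal sketch of the idea, with made-up MN-prefixed names rather than CorePlot's actual declarations. Each target compiles only its own copy of the header, so shared code refers exclusively to the agnostic typedefs:

    // MacOnly/MNPlatformSpecificDefines.h -- included only in the Mac target
    #import <Cocoa/Cocoa.h>
    typedef NSImage MNNativeImage;
    typedef NSColor MNNativeColor;

    // iPhoneOnly/MNPlatformSpecificDefines.h -- included only in the iPhone target
    #import <UIKit/UIKit.h>
    typedef UIImage MNNativeImage;
    typedef UIColor MNNativeColor;

Shared model and drawing code can then declare ivars such as MNNativeImage *thumbnail without ever naming NSImage or UIImage directly.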

They also have a custom color class, CPColor, based on CGColorRef.

For NSPoint and NSRect, I would use the Core Graphics structs in the model and convert them using NSRectFromCGRect and NSPointFromCGPoint where needed (Mac OS X 10.5+).
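
A minimal sketch of that approach, assuming the shared model hands out CGRect values; applyModelFrame() is an invented helper, not existing API:

    #import <TargetConditionals.h>
    #if TARGET_OS_IPHONE
    #import <UIKit/UIKit.h>

    // UIKit already works in CGRect, so no conversion is needed.
    static void applyModelFrame(UIView *view, CGRect modelFrame)
    {
        [view setFrame:modelFrame];
    }
    #else
    #import <Cocoa/Cocoa.h>

    // AppKit expects NSRect; convert at the view boundary (Foundation, 10.5+).
    static void applyModelFrame(NSView *view, CGRect modelFrame)
    {
        [view setFrame:NSRectFromCGRect(modelFrame)];
    }
    #endif

The model itself never sees NSRect; only the Mac controller/view layer pays the (cheap) conversion cost.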

A recent CIMGF article also deals with iPhone/Mac incompatibilities:
Creating a NSManagedObject that is Cross Platform


I might be misunderstanding your setup and question, but it sounds like you have an insufficiently abstracted data model.

Strictly speaking, "NSPoints/CGPoints, NSRect/CGRects, NSColor/UIColor and NSImage/UIImage structs/objects" are all implementation/UI elements that have nothing to do with the data model. Granted, the API makes it easy to archive these, but that lures you into the problem you have now: you're saving objects/structs that are tied to specific hardware and specific implementations, and now you can't port/reuse them easily.

The better way is to create an abstracted data model that knows nothing about the hardware or the rest of the API. It should store points and rects as strings or numbers, colors as numbers, strings or raw data, and images as raw data.

This way the core of your application, i.e. the data it actually manipulates, is generic. To display the information, you just have the controller request the raw data and convert it into the hardware/API-specific struct/object.
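
As a sketch of what that conversion boundary can look like for geometry stored as strings (the MN-prefixed helpers are invented for illustration, not existing API):

    #import <TargetConditionals.h>
    #import <Foundation/Foundation.h>
    #if TARGET_OS_IPHONE
    #import <UIKit/UIKit.h>      // NSStringFromCGRect / CGRectFromString
    #else
    #import <Cocoa/Cocoa.h>      // NSStringFromRect / NSRectFromString
    #endif

    // The model stores only strings such as @"{{10, 20}, {100, 50}}".
    static NSString *MNStringFromModelRect(CGRect rect)
    {
    #if TARGET_OS_IPHONE
        return NSStringFromCGRect(rect);
    #else
        return NSStringFromRect(NSRectFromCGRect(rect));    // 10.5+
    #endif
    }

    static CGRect MNModelRectFromString(NSString *string)
    {
    #if TARGET_OS_IPHONE
        return CGRectFromString(string);
    #else
        return NSRectToCGRect(NSRectFromString(string));    // 10.5+
    #endif
    }

Both platforms produce and parse the same string format, so the stored value itself stays platform-neutral.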

Core Data provides a good example of an abstracted data model. It only stores strings, numbers, dates, booleans, etc., yet it can store information of arbitrary complexity for any platform that supports Core Data.

Even if you don't use Core Data, that is the type of data model you should shoot for.


Re "Writing my own rect-functions that call the appropriate function depending on arch": this approach will work well.

Re "Writing my own MNColor object that encapsulates NSColor and UIColor": this will also work, provided the design of your wrapper class can handle MNColor objects in cross-platform situations, i.e. a database created on the Mac and imported on the iPhone should be able to hand you a UIColor object through your wrapper instead of an NSColor.
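
A rough sketch of such a wrapper, assuming platform-neutral RGBA storage (the design details here are an illustration, not an established MNColor implementation):

    #import <TargetConditionals.h>
    #import <Foundation/Foundation.h>
    #if TARGET_OS_IPHONE
    #import <UIKit/UIKit.h>
    #else
    #import <Cocoa/Cocoa.h>
    #endif

    @interface MNColor : NSObject <NSCoding> {
        CGFloat red, green, blue, alpha;   // platform-neutral storage
    }
    - (id)initWithRed:(CGFloat)r green:(CGFloat)g blue:(CGFloat)b alpha:(CGFloat)a;
    #if TARGET_OS_IPHONE
    - (UIColor *)nativeColor;              // iPhone callers get a UIColor
    #else
    - (NSColor *)nativeColor;              // Mac callers get an NSColor
    #endif
    @end

    @implementation MNColor
    - (id)initWithRed:(CGFloat)r green:(CGFloat)g blue:(CGFloat)b alpha:(CGFloat)a
    {
        if ((self = [super init])) { red = r; green = g; blue = b; alpha = a; }
        return self;
    }
    #if TARGET_OS_IPHONE
    - (UIColor *)nativeColor
    {
        return [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
    }
    #else
    - (NSColor *)nativeColor
    {
        return [NSColor colorWithCalibratedRed:red green:green blue:blue alpha:alpha];
    }
    #endif
    // Archiving only the four components keeps the stored data readable on
    // either platform.
    - (void)encodeWithCoder:(NSCoder *)coder
    {
        [coder encodeDouble:red forKey:@"r"];
        [coder encodeDouble:green forKey:@"g"];
        [coder encodeDouble:blue forKey:@"b"];
        [coder encodeDouble:alpha forKey:@"a"];
    }
    - (id)initWithCoder:(NSCoder *)coder
    {
        if ((self = [super init])) {
            red   = [coder decodeDoubleForKey:@"r"];
            green = [coder decodeDoubleForKey:@"g"];
            blue  = [coder decodeDoubleForKey:@"b"];
            alpha = [coder decodeDoubleForKey:@"a"];
        }
        return self;
    }
    @end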


It depends on your usage, but I discourage storing images in a database. They're better off on the filesystem, with (perhaps) the paths to the images stored in the database.

The one case where I can see a gain from having the images in the database is if you want a single filesystem unit that you can move around and that carries everything with it, though on the iPhone this isn't a likely use case.
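
If you go the path-in-the-database route, the bookkeeping is small; a minimal sketch, assuming the image bytes are already in an NSData and the file name is chosen by the caller:

    #import <Foundation/Foundation.h>

    // Writes the image bytes into the Documents directory and returns the
    // path to store in the model, or nil on failure. (Illustrative helper.)
    static NSString *MNSaveImageData(NSData *imageData, NSString *fileName)
    {
        NSArray *dirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                            NSUserDomainMask, YES);
        NSString *path = [[dirs objectAtIndex:0]
                             stringByAppendingPathComponent:fileName];
        return [imageData writeToFile:path atomically:YES] ? path : nil;
    }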
