Source - http://shapeace.tistory.com/27

iPhone Programming for Dummy

1. Installing the iPhone SDK


1.1. Getting Started

The iPhone SDK is a development kit distributed free of charge by Apple; anyone who registers on the Apple developer site can download it. It contains everything needed for actual development work: the Xcode development environment, based on GCC; a simulator that lets you test everything on the Mac; and performance measurement tools for tuning your application while it runs on a real device.

Before installing, check that the following requirements are met.

  • An Intel-based Mac
The iPhone SDK officially runs only on Intel-based Macs. If you install the SDK on a PPC-based Mac, the iPhone-related components are not installed. That said, PPC-based Macs are not entirely out of luck;
this is covered again below in section 1.3, Installing on PPC.

  • Mac OS X 10.5.3 or later
Xcode refuses to install on earlier versions of Mac OS X. It was not always this way;
the restriction appeared with SDK Beta 6.

  • A basic understanding of the C programming language
Strictly speaking, you need an understanding of the Objective-C language. The Xcode development environment on the Mac is fundamentally based on Objective-C, and its history goes back as far as the NeXTSTEP system. The problem is that not many people know Objective-C, but fortunately it is nowhere near as difficult a language as C++. This document will avoid going deeply into the Objective-C language itself. However, if you lack even a basic understanding of C, set iPhone SDK development aside for a while and study C first; this document also assumes the reader is comfortable with C.


1.2. Installing the SDK

Anyone registered with the Apple Developer Connection can download the SDK from http://developer.apple.com/iphone/index.action.

The download is a dmg file of roughly 1.2 GB. Once it has finished, mount the downloaded dmg file.
A disk image with the following contents will appear.

Double-click the second item, the iPhone SDK package file, to begin the installation. There is no particular need to look inside the Packages folder; its contents are installed onto the local disk during installation. Users installing the SDK on PPC, however, will need to open this folder later.

The installation process is simple. Choose a disk volume, proceed with the default settings without changing anything, and everything will be fine.
A few steps into the installer, an administrator password prompt appears; enter the password and installation proceeds, showing a screen like the one below. Wait ten minutes or more and the installation completes.



Some may complain that the installation takes longer than expected, but this is not merely an environment for building iPhone applications: it includes the complete set of development tools for Mac OS programs, Java included.

Once the software is installed, you will find several development programs under /Developer/Applications.

If you are using an Intel-based Mac, the basic preparation is now complete.


1.3. Installing on PPC

Sadly, even after going through all of the steps above, users of PowerPC-based Macs still cannot develop iPhone applications; they merely have an Xcode environment for developing Mac desktop programs. Apple officially states that iPhone applications cannot be developed on PPC-based Macs.

There is a way, however. The following steps let you try iPhone development even on G4 and G5 Macs. Fair warning: a few features are said to be unavailable, though exactly which ones has not been pinned down. Nearly all of the basic functionality, however, works without problems.

First, return to the disk image you mounted for installation.
As mentioned earlier, it contains a folder named Packages. Open it and you will see the various installer packages.

These are the packages that were installed automatically during SDK installation. As you may have noticed while installing, the iPhone-related items were not installed for PPC users; their checkboxes were disabled, so they could not even be selected manually.

The task now is to install, one by one and by hand, the five packages whose names begin with iPhone. The installer includes a step where the destination folder can be changed; install all five to the default location and click through.

You will have to work through a screen like this for each one...

Once all are installed, use the Finder to look at the root (/) of your hard disk. A folder named 'Platforms' will have appeared there, and under it are two folders. Move both of these folders into /Developer/Platforms.
Because many folders and files overlap, choose 'Replace' when the duplicate confirmation dialog appears. Permissions on the folders at the existing location will keep the move from completing in a single pass; track down the problem items one at a time and you should be able to move every file without much trouble.

Most of the hard work is now done.
One last step remains: navigate to the path below and check the files there.

/Developer/Platforms/iPhoneSimulator.platform/Developer/Library/Xcode/Specification

Quite a long path. When you reach the final destination, you will find three files.
Open the one named iPhone Simulator Architectures.xcspec in a text editor.
Its contents are shown below. This file tells the compiler which targets it may generate code for when building for the iPhone Simulator.

 
/**
    iPhone Simulator Architectures.xcspec
   
    Copyright (c) 2008 Apple Inc.  All rights reserved.

    Architecture specifications in the iPhone Simulator platform.
*/
(
    // 32-Bit
    {   Type = Architecture;
        Identifier = Standard;
        Name = "Standard (iPhone Simulator: i386)";
        Description = "32-bit iPhone Simulator architectures";
        ListInEnum = YES;
        SortNumber = 1;
        RealArchitectures = ( i386 );
        ArchitectureSetting = "ARCHS_STANDARD_32_BIT";
    },

    // Old-style Debug
    {    Type = Architecture;
        Identifier = Native;
        Name = "Native Architecture of Build Machine";
        Description = "32-bit for build machine";
        ListInEnum = YES;
        SortNumber = 101;
        ArchitectureSetting = "NATIVE_ARCH";
    },

    // Intel
    {   Type = Architecture;
        Identifier = i386;
        Name = "Intel";
        Description = "32-bit Intel";
        PerArchBuildSettingName = "Intel";
        ByteOrder = little;
        ListInEnum = NO;
        SortNumber = 105;
    },
)

Back up the original file first, in whatever way is convenient, just in case.
Then edit the file's contents as follows and save it.
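If you want a repeatable way to take that backup, a small shell helper like the one below works. The function name and the .orig suffix are our own choices, not part of the tutorial, and you may need sudo since /Developer is system-owned:

```shell
# Hypothetical helper: copy a file to a ".orig" sibling, refusing to
# clobber a backup that already exists.
backup_orig() {
  if [ -e "$1.orig" ]; then
    echo "backup already exists: $1.orig" >&2
  else
    cp -p "$1" "$1.orig"
  fi
}
```

For example: backup_orig "/Developer/Platforms/iPhoneSimulator.platform/Developer/Library/Xcode/Specification/iPhone Simulator Architectures.xcspec".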

 /**
    iPhone Simulator Architectures.xcspec
   
    Copyright (c) 2008 Apple Inc.  All rights reserved.

    Architecture specifications in the iPhone Simulator platform.
*/
(
    // 32-Bit
    {   Type = Architecture;
        Identifier = Standard;
        Name = "Standard (iPhone Simulator: ppc)";
        Description = "32-bit iPhone Simulator architectures";
        ListInEnum = YES;
        SortNumber = 1;
        RealArchitectures = ( ppc );
        ArchitectureSetting = "ARCHS_STANDARD_32_BIT";
    },

    // Old-style Debug
    {    Type = Architecture;
        Identifier = Native;
        Name = "Native Architecture of Build Machine";
        Description = "32-bit for build machine";
        ListInEnum = YES;
        SortNumber = 101;
        ArchitectureSetting = "NATIVE_ARCH";
    },

    // G5 32-bit
    {   Type = Architecture;
        Identifier = ppc;
        Name = "PowerPC G5 32-bit";
        Description = "32-bit PowerPC for G5 processor";
        ByteOrder = big;
        ListInEnum = NO;
        SortNumber = 203;
    }
)

Without this change, the simulator cannot be used even after you write your code. Having made it through this arduous process, PPC users too have finally earned the right to use the iPhone SDK.


Now everyone stands at the starting line; all that remains is deciding to keep running.
Intel or PPC, it no longer matters: let's go have a look at Xcode's face. Where is Xcode, you ask? It is bound to be somewhere under /Developer, but who cares. Just type Xcode into Spotlight and press Enter. Then go ahead and park Xcode in the Dock.



Welcome, Dummies.

 



Source: http://blog.naver.com/PostView.nhn?blogId=seogi1004&logNo=110086946825
Posted by 오늘마감

Source: http://blog.naver.com/PostView.nhn?blogId=hextrial&logNo=60119241862
Posted by 오늘마감
[iPhone App Development] OpenGL ES for iPhone : Part 3 with Accelerometer control

OpenGL ES for iPhone : Part 3 with Accelerometer control

In this Part 3, we will add accelerometer control to move the ellipse object that we created in Part 2 of the tutorial.



1) UIAccelerometerDelegate
We need to add the UIAccelerometerDelegate protocol to EAGLView and implement the accelerometer:didAccelerate: method, as below:


@interface EAGLView : UIView <UIAccelerometerDelegate>

- (void)accelerometer:(UIAccelerometer*)accelerometer didAccelerate:(UIAcceleration*)acceleration;


We need to configure and start the accelerometer in the setupView method

[[UIAccelerometer sharedAccelerometer] setUpdateInterval:(1.0 / kAccelerometerFrequency)];
[[UIAccelerometer sharedAccelerometer] setDelegate:self];


2) Accelerometer values
Inside the accelerometer:didAccelerate: method, we apply a low-pass filter to the accelerometer values. This low-pass filter code comes from Apple's GLGravity sample code.

//Use a basic low-pass filter in the accelerometer values
accel[0] = acceleration.x * kFilteringFactor + accel[0] * (1.0 - kFilteringFactor);
accel[1] = acceleration.y * kFilteringFactor + accel[1] * (1.0 - kFilteringFactor);
accel[2] = acceleration.z * kFilteringFactor + accel[2] * (1.0 - kFilteringFactor);
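To see what this filter does in isolation, here is a plain-C sketch (our own, not part of the tutorial) using the same kFilteringFactor of 0.1 that the listing defines later. Each call blends the new raw reading into the running value, so the output drifts smoothly toward a steady input instead of jumping:

```c
#include <assert.h>
#include <math.h>

#define kFilteringFactor 0.1

/* One low-pass step: blend the raw reading into the running filtered value.
   Smaller factors smooth more aggressively but respond more slowly. */
static double low_pass(double raw, double filtered)
{
    return raw * kFilteringFactor + filtered * (1.0 - kFilteringFactor);
}
```

With a steady input of 1.0 and a starting value of 0.0, the first step yields 0.1, and repeated steps converge toward 1.0; that lag is exactly what suppresses the jitter in raw accelerometer samples.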


The meaning of accelerometer values:

acceleration.x = Roll. It corresponds to roll, or rotation around the axis that runs from your home button to your earpiece. Values vary from 1.0 (rolled all the way to the right) to -1.0 (rolled all the way to the left).

acceleration.y = Pitch. Place your iPhone on the table and mentally draw a horizontal line about half-way down the screen. That's the axis around which the Y value rotates. Values go from 1.0 (the headphone jack straight down) to -1.0 (the headphone jack straight up).

acceleration.z = Face up/face down. It refers to whether your iPhone is face up (-1.0) or face down (1.0). When the iPhone is placed on its side, either the side with the volume controls and ringer switch or the side directly opposite, the Z value equates to 0.0.

3) Movement of the ellipse is controlled through the variables moveX and moveY: the ellipse position changes according to the acceleration.x (that is, accel[0]) and acceleration.y (that is, accel[1]) values passed in from the accelerometer after the low-pass filter. The larger the absolute value of acceleration.x or acceleration.y, the greater the magnitude of moveX or moveY, and thus the faster the ellipse moves in that direction. Because the object should not move beyond the screen, the ellipseData.pos.x and ellipseData.pos.y values are clamped to the boundaries of the screen.

 ellipseData.pos.x += moveX;
 if (accel[0] > -0.1 && accel[0] < 0.1) {
   moveX = 0.0f;
 }
 else {
   moveX = 10.0f * accel[0];
 }

 ellipseData.pos.y += moveY;
 if (accel[1] > -0.1 && accel[1] < 0.1) {
   moveY = 0.0f;
 }
 else {
   moveY = -10.0f * accel[1];
 }
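The dead-zone-plus-gain rule above can be factored into small helpers. This is a plain-C sketch under the tutorial's numbers (a 0.1 threshold and a gain of 10), with function names of our own invention:

```c
#include <assert.h>

/* Per-frame step from a filtered acceleration value: tilts inside the
   +/-0.1 dead zone produce no movement; larger tilts scale the step. */
static float step_from_accel(float a, float gain)
{
    return (a > -0.1f && a < 0.1f) ? 0.0f : gain * a;
}

/* Keep a coordinate inside the screen bounds, as drawView does at the end. */
static float clamp_pos(float v, float lo, float hi)
{
    return v < lo ? lo : (v > hi ? hi : v);
}
```

So step_from_accel(0.05f, 10.0f) is 0 (inside the dead zone), while step_from_accel(0.5f, 10.0f) is 5, and clamp_pos keeps pos.x within the 30 to 290 range used for the 320-pixel-wide screen.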


4) Conditional compilation for the iPhone Simulator and on-screen debug info
As the iPhone Simulator has no accelerometer support, we have placed the code that changes the ellipse position inside this compiler directive, so that the ellipse keeps moving on the iPhone Simulator.
  #if TARGET_IPHONE_SIMULATOR 

Moreover, we have added a UILabel to the code so that we can read the accelerometer values while debugging the program on an actual device. This UILabel can be disabled using this define directive.
  #undef DEBUGSCREEN
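The shape of that conditional compilation is easy to see in a minimal plain-C sketch (our own; on Apple's SDK the macro comes from TargetConditionals.h, while on other toolchains it is simply undefined, so the device branch compiles):

```c
#include <assert.h>

/* Compile one branch for the simulator and another for the device.
   An undefined macro makes the #if condition false, selecting #else. */
#if defined(TARGET_IPHONE_SIMULATOR) && TARGET_IPHONE_SIMULATOR
static const int building_for_simulator = 1;
#else
static const int building_for_simulator = 0;
#endif
```

The tutorial's bare #if TARGET_IPHONE_SIMULATOR behaves the same way, because an undefined identifier evaluates to 0 inside #if.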

5) The source code is below. Just create a new project from the OpenGL ES Application template in Xcode, then copy the contents of EAGLView.h and EAGLView.m from below, paste them in, and Build & Go in Xcode. The accelerometer control can only be tested on an actual device.



EAGLView.h Select all

// EAGLView.h
// OpenGL ES Tutorial - Part 3 by javacom


// To enable Debug NSLog, add GCC_PREPROCESSOR_DEFINITIONS DEBUGON in Project Settings for Debug Build Only and replace NSLog() with DEBUGLOG()
#ifdef DEBUGON
#define DEBUGLOG if (DEBUGON) NSLog
#else
#define DEBUGLOG
#endif

#define DEBUGSCREEN

#import <UIKit/UIKit.h>
#import <OpenGLES/EAGL.h>
#import <OpenGLES/ES1/gl.h>
#import <OpenGLES/ES1/glext.h>

typedef struct
{
BOOL rotstop; // stop self rotation
BOOL touchInside; // finger tap inside of the object ?
BOOL scalestart; // start to scale the object ?
CGPoint pos; // position of the object on the screen
CGPoint startTouchPosition; // Start Touch Position
CGPoint currentTouchPosition; // Current Touch Position
GLfloat pinchDistance; // distance between two fingers pinch
GLfloat pinchDistanceShown; // distance that have shown on screen
GLfloat scale; // OpenGL scale factor of the object
GLfloat rotation; // OpenGL rotation factor of the object
GLfloat rotspeed; // control rotation speed of the object
} ObjectData;

/*
This class wraps the CAEAGLLayer from CoreAnimation into a convenient UIView subclass.
The view content is basically an EAGL surface you render your OpenGL scene into.
Note that setting the view non-opaque will only work if the EAGL surface has an alpha channel.
*/
@interface EAGLView : UIView {

@private
/* The pixel dimensions of the backbuffer */
GLint backingWidth;
GLint backingHeight;

EAGLContext *context;

/* OpenGL names for the renderbuffer and framebuffers used to render to this view */
GLuint viewRenderbuffer, viewFramebuffer;

/* OpenGL name for the depth buffer that is attached to viewFramebuffer, if it exists (0 if it does not exist) */
GLuint depthRenderbuffer;

NSTimer *animationTimer;
NSTimeInterval animationInterval;

@public
ObjectData squareData;
ObjectData ellipseData;
GLfloat ellipseVertices[720];
CGFloat initialDistance;
UIAccelerationValue accel[3];
GLfloat moveX, moveY;
#ifdef DEBUGSCREEN
UILabel *textView;
#endif
}

@property NSTimeInterval animationInterval;

@property (nonatomic) ObjectData squareData;
@property (nonatomic) ObjectData ellipseData;
@property CGFloat initialDistance;
#ifdef DEBUGSCREEN
@property (nonatomic, assign) UILabel *textView;
#endif

- (void)startAnimation;
- (void)stopAnimation;
- (void)drawView;
- (void)setupView;

@end


EAGLView.m Select all

// EAGLView.m
// OpenGL ES Tutorial - Part 3 by javacom
//
#import <QuartzCore/QuartzCore.h>
#import <OpenGLES/EAGLDrawable.h>

#import "EAGLView.h"

#include <math.h>

// Macros
#define degreesToRadians(__ANGLE__) (M_PI * (__ANGLE__) / 180.0)
#define radiansToDegrees(__ANGLE__) (180.0 * (__ANGLE__) / M_PI)

CGFloat distanceBetweenPoints (CGPoint first, CGPoint second) {
CGFloat deltaX = second.x - first.x;
CGFloat deltaY = second.y - first.y;
return sqrt(deltaX*deltaX + deltaY*deltaY );
};

CGFloat angleBetweenPoints(CGPoint first, CGPoint second) {
// atan((top - bottom)/(right - left))
CGFloat rads = atan((second.y - first.y) / (first.x - second.x));
return radiansToDegrees(rads);
}

CGFloat angleBetweenLines(CGPoint line1Start, CGPoint line1End, CGPoint line2Start, CGPoint line2End) {

CGFloat a = line1End.x - line1Start.x;
CGFloat b = line1End.y - line1Start.y;
CGFloat c = line2End.x - line2Start.x;
CGFloat d = line2End.y - line2Start.y;

CGFloat rads = acos(((a*c) + (b*d)) / ((sqrt(a*a + b*b)) * (sqrt(c*c + d*d))));

return radiansToDegrees(rads);
}

#define USE_DEPTH_BUFFER 0

// CONSTANTS
#define kMinimumTouchLength 30
#define kMaximumScale 7.0f
#define kMinimumPinchDelta 15
#define kAccelerometerFrequency 100.0 // Hz
#define kFilteringFactor 0.1


// A class extension to declare private methods
@interface EAGLView ()

@property (nonatomic, retain) EAGLContext *context;
@property (nonatomic, assign) NSTimer *animationTimer;

- (BOOL) createFramebuffer;
- (void) destroyFramebuffer;

@end


@implementation EAGLView

@synthesize context;
@synthesize animationTimer;
@synthesize animationInterval;
@synthesize squareData;
@synthesize ellipseData;
@synthesize initialDistance;
#ifdef DEBUGSCREEN
@synthesize textView;
#endif

// You must implement this method
+ (Class)layerClass {
return [CAEAGLLayer class];
}


//The GL view is stored in the nib file. When it's unarchived it's sent -initWithCoder:
- (id)initWithCoder:(NSCoder*)coder {

if ((self = [super initWithCoder:coder])) {

// Get the layer
CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;

eaglLayer.opaque = YES;
eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:NO], kEAGLDrawablePropertyRetainedBacking, kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat, nil];

context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];

if (!context || ![EAGLContext setCurrentContext:context]) {
[self release];
return nil;
}

animationInterval = 1.0 / 60.0;
[self setupView];
}
return self;
}

// The four methods touchesBegan, touchesMoved, touchesEnded and touchesCancelled are used to receive notifications about touches and gestures

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
/*
NSUInteger numTaps = [[touches anyObject] tapCount]; // number of taps
NSUInteger numTouches = [touches count]; // number of touches
*/
UITouch *touch = [[touches allObjects] objectAtIndex:0];

DEBUGLOG(@"TouchBegan event counts = %d ",[[event touchesForView:self] count]);
DEBUGLOG(@"TouchBegan tounches counts = %d ",[touches count]);
if ([touches count]== 2) {
NSArray *twoTouches = [touches allObjects];
UITouch *first = [twoTouches objectAtIndex:0];
UITouch *second = [twoTouches objectAtIndex:1];
initialDistance = distanceBetweenPoints([first locationInView:self], [second locationInView:self]);
squareData.rotstop = YES;
squareData.touchInside = NO;
}
else if ([touches count]==[[event touchesForView:self] count] && [[event touchesForView:self] count] == 1) {
squareData.startTouchPosition = [touch locationInView:self];
if (distanceBetweenPoints([touch locationInView:self], squareData.pos) <= kMinimumTouchLength * squareData.scale) {
DEBUGLOG(@"Square Touch at %.2f, %.2f ",squareData.pos.x,squareData.pos.y);
squareData.rotstop = YES;
squareData.touchInside = YES;
}
}

}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [[touches allObjects] objectAtIndex:0];
squareData.currentTouchPosition = [touch locationInView:self];
if ([touches count]== 2) {
NSArray *twoTouches = [touches allObjects];
UITouch *first = [twoTouches objectAtIndex:0];
UITouch *second = [twoTouches objectAtIndex:1];

// Calculate the distance between the two fingers (touches) to determine the pinch distance
CGFloat currentDistance = distanceBetweenPoints([first locationInView:self], [second locationInView:self]);

squareData.rotstop = YES;
squareData.touchInside = NO;

if (initialDistance == 0.0f)
initialDistance = currentDistance;
if (currentDistance - initialDistance > kMinimumPinchDelta) {
squareData.pinchDistance = currentDistance - initialDistance;
squareData.scalestart = YES;
DEBUGLOG(@"Outward Pinch %.2f", squareData.pinchDistance);
}
else if (initialDistance - currentDistance > kMinimumPinchDelta) {
squareData.pinchDistance = currentDistance - initialDistance;
squareData.scalestart = YES;
DEBUGLOG(@"Inward Pinch %.2f", squareData.pinchDistance);
}
}
else if ([touches count]==[[event touchesForView:self] count] && [[event touchesForView:self] count] == 1) {
if (squareData.touchInside) {
// Only move the square to new position when touchBegan is inside the square
squareData.pos.x = [touch locationInView:self].x;
squareData.pos.y = [touch locationInView:self].y;
DEBUGLOG(@"Square Move to %.2f, %.2f ",squareData.pos.x,squareData.pos.y);
squareData.rotstop = YES;
}
}
}


- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
if ([touches count] == [[event touchesForView:self] count]) {
initialDistance = squareData.pinchDistanceShown = squareData.pinchDistance = 0.0f;
squareData.rotstop = squareData.touchInside = squareData.scalestart = NO;
DEBUGLOG(@"touchesEnded, all fingers up");
}
else {
initialDistance = squareData.pinchDistanceShown = squareData.pinchDistance = 0.0f;
squareData.scalestart = NO;
DEBUGLOG(@"touchesEnded");
}
}


- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
initialDistance = squareData.pinchDistanceShown = squareData.pinchDistance = 0.0f;
squareData.rotstop = squareData.touchInside = squareData.scalestart = NO;
DEBUGLOG(@"touchesCancelled");
}

- (void)setupView { // new method for initialisation of variables and states

// Enable Multi Touch of the view
self.multipleTouchEnabled = YES;

//Configure and start accelerometer
[[UIAccelerometer sharedAccelerometer] setUpdateInterval:(1.0 / kAccelerometerFrequency)];
[[UIAccelerometer sharedAccelerometer] setDelegate:self];
#if TARGET_IPHONE_SIMULATOR
moveX = 2.0f;
moveY = 3.0f;
#else
moveX = 0.0f;
moveY = 0.0f;
#endif

#ifdef DEBUGSCREEN
UIColor *bgColor = [[UIColor alloc] initWithWhite:1.0f alpha:0.0f];
textView = [[UILabel alloc] initWithFrame:CGRectMake(10.0f, 350.0f, 300.0f, 96.0f)];
textView.text = [NSString stringWithFormat:@"-Accelerometer Data-"];
textView.textAlignment = UITextAlignmentLeft;
[textView setNumberOfLines:4];
textView.backgroundColor = bgColor;
textView.font = [UIFont fontWithName:@"Arial" size:18];
[self addSubview:textView];
[self bringSubviewToFront:textView];
#endif


// Initialise square data
squareData.rotation = squareData.pinchDistance = squareData.pinchDistanceShown = 0.0f;
ellipseData.rotation = 0.0f;
squareData.scale = 1.0f;
squareData.rotstop = squareData.touchInside = squareData.scalestart = NO;
squareData.pos.x = 160.0f;
squareData.pos.y = 240.0f;
squareData.pinchDistance = 0.0f;
squareData.rotspeed = 1.0f;

// Initialise ellipse data
ellipseData.rotation = 0.0f;
ellipseData.rotstop = ellipseData.touchInside = ellipseData.scalestart = NO;
ellipseData.pos.x = 160.0f;
ellipseData.pos.y = 100.0f;
ellipseData.rotspeed = -4.0f;

// calculate the vertices of ellipse
const GLfloat xradius = 35.0f;
const GLfloat yradius = 25.0f;
for (int i = 0; i < 720; i+=2) {
ellipseVertices[i] = (cos(degreesToRadians(i)) * xradius) + 0.0f;
ellipseVertices[i+1] = (sin(degreesToRadians(i)) * yradius) + 0.0f;
// DEBUGLOG(@"ellipseVertices[v%d] %.1f, %.1f",i, ellipseVertices[i], ellipseVertices[i+1]);
}

// setup the projection matrix
glMatrixMode(GL_PROJECTION);
glLoadIdentity();

// Setup Orthographic Projection for the 320 x 480 of the iPhone screen
glOrthof(0.0f, 320.0f, 480.0f, 0.0f, -1.0f, 1.0f);
glMatrixMode(GL_MODELVIEW);

}

- (void)drawView {

// Define the square vertices
const GLfloat squareVertices[] = {
-20.0f, -20.0f,
20.0f, -20.0f,
-20.0f, 20.0f,
20.0f, 20.0f,
};

// Define the colors of the square vertices
const GLubyte squareColors[] = {
255, 255, 0, 255,
0, 255, 255, 255,
0, 0, 0, 0,
255, 0, 255, 255,
};


// Define the colors of the ellipse vertices
const GLubyte ellipseColors[] = {
233, 85, 85, 255,
233, 85, 85, 255,
233, 85, 85, 255,
233, 85, 85, 255,
233, 85, 85, 255,
};


[EAGLContext setCurrentContext:context];
glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
glViewport(0, 0, backingWidth, backingHeight);

// Clear background color
glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);

// draw the square
glLoadIdentity();
glTranslatef(squareData.pos.x, squareData.pos.y, 0.0f);
glRotatef(squareData.rotation, 0.0f, 0.0f, 1.0f);
glScalef(squareData.scale, squareData.scale, 1.0f);
glVertexPointer(2, GL_FLOAT, 0, squareVertices);
glColorPointer(4, GL_UNSIGNED_BYTE, 0, squareColors);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

// draw the ellipse
glLoadIdentity();
glTranslatef(ellipseData.pos.x, ellipseData.pos.y, 0.0f);
glRotatef(ellipseData.rotation, 0.0f, 0.0f, 1.0f);
glVertexPointer(2, GL_FLOAT, 0, ellipseVertices);
glColorPointer(4, GL_UNSIGNED_BYTE, 0, ellipseColors);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
glDrawArrays(GL_TRIANGLE_FAN, 0, 360); // the ellipse has 360 vertices

// control the square rotation
if (!squareData.rotstop) {
squareData.rotation += squareData.rotspeed;
if(squareData.rotation > 360.0f)
squareData.rotation -= 360.0f;
else if(squareData.rotation < -360.0f)
squareData.rotation += 360.0f;
}

// control the ellipse rotation
if (!ellipseData.rotstop) {
ellipseData.rotation += ellipseData.rotspeed;
if(ellipseData.rotation > 360.0f)
ellipseData.rotation -= 360.0f;
else if(ellipseData.rotation < -360.0f)
ellipseData.rotation += 360.0f;
}

// control the square scaling
if (squareData.scalestart && squareData.scale <= kMaximumScale) {
GLfloat pinchDelta = squareData.pinchDistance - squareData.pinchDistanceShown;
if (squareData.pinchDistance != 0.0f) {
squareData.scale += pinchDelta/30;
squareData.pinchDistanceShown = squareData.pinchDistance;
if (squareData.scale >= kMaximumScale) {
squareData.scale = kMaximumScale;
squareData.pinchDistanceShown = squareData.pinchDistance = 0.0f;
squareData.scalestart = NO;
} else if (squareData.scale <= 1.0f) {
squareData.scale = 1.0f;
squareData.pinchDistanceShown = squareData.pinchDistance = 0.0f;
squareData.scalestart = NO;
}
DEBUGLOG(@"scale is %.2f",squareData.scale);
}
}

// control the ellipse movement
#if TARGET_IPHONE_SIMULATOR
ellipseData.pos.x += moveX;
if (ellipseData.pos.x >= 290.f) {
moveX = -2.0f;
}
else if (ellipseData.pos.x <= 30.f) {
moveX = 2.0f;
}

ellipseData.pos.y += moveY;
if (ellipseData.pos.y >= 450.f) {
moveY = -1.5f;
}
else if (ellipseData.pos.y <= 55.f) {
moveY = 3.5f;
}
#else
ellipseData.pos.x += moveX;
if (accel[0] > -0.1 && accel[0] < 0.1 ) {
moveX = 0.0f;
}
else {
moveX = 10.0f * accel[0];
}

ellipseData.pos.y += moveY;
if (accel[1] > -0.1 && accel[1] < 0.1 ) {
moveY = 0.0f;
}
else {
moveY = -10.0f * accel[1];
}
#endif
if (ellipseData.pos.x >= 290.f) {
ellipseData.pos.x = 290.0f;
}
else if (ellipseData.pos.x <= 30.f) {
ellipseData.pos.x = 30.0f;
}
if (ellipseData.pos.y >= 450.f) {
ellipseData.pos.y = 450.0f;
}
else if (ellipseData.pos.y <= 55.f) {
ellipseData.pos.y = 55.0f;
}


glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER_OES];
}

- (void)accelerometer:(UIAccelerometer*)accelerometer didAccelerate:(UIAcceleration*)acceleration
{
/*
The meaning of acceleration values for firmware 2.x
acceleration.x = Roll. It corresponds to roll, or rotation around the axis that runs from your home button to your earpiece.
Values vary from 1.0 (rolled all the way to the right) to -1.0 (rolled all the way to the left).

acceleration.y = Pitch. Place your iPhone on the table and mentally draw a horizontal line about half-way down the screen.
That's the axis around which the Y value rotates.
Values go from 1.0 (the headphone jack straight down) to -1.0 (the headphone jack straight up).

acceleration.z = Face up/face down.
It refers to whether your iPhone is face up (-1.0) or face down (1.0).
When placed on it side, either the side with the volume controls and ringer switch, or the side directly opposite
, the Z value equates to 0.0.
*/

//Use a basic low-pass filter in the accelerometer values
accel[0] = acceleration.x * kFilteringFactor + accel[0] * (1.0 - kFilteringFactor);
accel[1] = acceleration.y * kFilteringFactor + accel[1] * (1.0 - kFilteringFactor);
accel[2] = acceleration.z * kFilteringFactor + accel[2] * (1.0 - kFilteringFactor);

#ifdef DEBUGSCREEN
textView.text = [NSString stringWithFormat:
@"X (roll, %4.1f%%): %f\nY (pitch %4.1f%%): %f\nZ (%4.1f%%) : %f",
100.0 - (accel[0] + 1.0) * 50.0, accel[0],
100.0 - (accel[1] + 1.0) * 50.0, accel[1],
100.0 - (accel[2] + 1.0) * 50.0, accel[2]
];
#endif
}

- (void)layoutSubviews {
[EAGLContext setCurrentContext:context];
[self destroyFramebuffer];
[self createFramebuffer];
[self drawView];
}


- (BOOL)createFramebuffer {

glGenFramebuffersOES(1, &viewFramebuffer);
glGenRenderbuffersOES(1, &viewRenderbuffer);

glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
[context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(CAEAGLLayer*)self.layer];
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, viewRenderbuffer);

glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);

if (USE_DEPTH_BUFFER) {
glGenRenderbuffersOES(1, &depthRenderbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);
}

if(glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES) {
DEBUGLOG(@"failed to make complete framebuffer object %x", glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES));
return NO;
}

return YES;
}


- (void)destroyFramebuffer {

glDeleteFramebuffersOES(1, &viewFramebuffer);
viewFramebuffer = 0;
glDeleteRenderbuffersOES(1, &viewRenderbuffer);
viewRenderbuffer = 0;

if(depthRenderbuffer) {
glDeleteRenderbuffersOES(1, &depthRenderbuffer);
depthRenderbuffer = 0;
}
}


- (void)startAnimation {
self.animationTimer = [NSTimer scheduledTimerWithTimeInterval:animationInterval target:self selector:@selector(drawView) userInfo:nil repeats:YES];
}


- (void)stopAnimation {
self.animationTimer = nil;
}


- (void)setAnimationTimer:(NSTimer *)newTimer {
[animationTimer invalidate];
animationTimer = newTimer;
}


- (void)setAnimationInterval:(NSTimeInterval)interval {

animationInterval = interval;
if (animationTimer) {
[self stopAnimation];
[self startAnimation];
}
}


- (void)dealloc {

[self stopAnimation];

if ([EAGLContext currentContext] == context) {
[EAGLContext setCurrentContext:nil];
}

[context release];
[super dealloc];
}

@end

Source: http://iphonesdkdev.blogspot.com/2009/04/opengl-es-for-iphone-part-3-with.html
Posted by 오늘마감
[iPhone App Development] Cocoa for Scientists (XXIV): Core Animation First Steps

Cocoa for Scientists (XXIV): Core Animation First Steps

Author: Drew McCormack
Web Site: www.mentalfaculty.com

Arguably the most important change in Leopard was not a user feature, but a developer one: Core Animation. Over the coming years, the way applications look and react to user interaction will change dramatically, and that will be largely due to the ease with which interfaces can be animated with Core Animation. Sure, there will also be overt eye candy, but there will also be lots of more subtle changes to application interfaces that truly benefit the user, giving useful feedback.

Core Animation is obviously an enormous boon for user interface developers, but it has other uses too, such as visualization. There are many scientific applications for which Core Animation could be very useful. In the next few tutorials, I want to introduce Core Animation, and show its potential for scientific visualization. I won’t be showing you how to create the CoverFlow effect, or reprogram the Front Row application, but will hopefully introduce you to a whole new way of looking at Core Animation.

What is Core Animation?

It is easy to get confused about what exactly Core Animation is, because it is an umbrella for several different types of functionality. And what makes it more confusing is that developers can use it implicitly with Cocoa NSView objects to animate their UIs. These views are called layer-backed views, and strictly speaking are not part of Core Animation itself, but do make use of it.

In addition to programming in Cocoa with layer-backed views, you can also program directly with Core Animation classes. Core Animation combines a number of different aspects. First there is animation: Core Animation can animate a property (e.g. position, opacity, orientation) in time. It performs this animation on a dedicated thread, so the animation goes on even if the hardware can’t keep up — Core Animation will simply drop frames to make sure that the animation finishes on time.

The second important aspect of Core Animation is layering. (While still in the pre-release phase, Core Animation was even called ‘Layer Kit’, and Core Animation classes had an ‘LK’ prefix.) Layers are a bit like Cocoa views, but they exist in three-dimensional space. They are rectangular, and can hold assorted content, such as OpenGL renderings, QuickTime movies, images, and text. Each layer can hold different content, and layers can be superimposed, meaning you can effectively combine different types of content in a single view. For example, you could place some controls on top of a playing QuickTime movie, or have a 2D image appear next to an OpenGL rendering.

So Core Animation is not simply animation, but layers and animation. And the two work beautifully in harmony. For example, set the position of a layer and it animates to the new position, all the while presenting its content. A QuickTime movie will continue to play, and an OpenGL view continue to animate (if it is an animated rendering).

What is it not?

The 3D nature of Core Animation can also be a bit confusing. Don’t we already have that? Isn’t it called OpenGL?

Core Animation is not a 3D engine, and in that sense it should probably be called 2.5D. Layers do have a position along the Z axis, which comes out of the screen, but if two layers intersect, Core Animation will not do the math to make sure things look ‘right’. Better not to let layers intersect.

Another thing to remember is that layers are not polygons. In OpenGL, it is easy to build up an arbitrary surface out of polygonal pieces, but layers are rectangular, and cannot represent arbitrary surfaces. Layers provide a 2D canvas that moves in a 3D space, and are not appropriate for representing true 3D objects. For that you need OpenGL.

The Fleas on the Fleas

To introduce you to Core Animation, I’m going to develop a simple application called ‘Flea on Flea’. When complete, this app will have lots of Core Animation layers — the fleas — moving around on parent layers — the other fleas — in a recursively animated collage.

Sound enticing? Well, you will have to wait, because this week we are only going to get up to animating a simple square on a black background. Although this might not sound too exciting, at the end of the tutorial, you will already have seen many of the most important aspects of Core Animation programming.

Before embarking on the tutorial proper, I suggest you download the finished app, and see how it works. There is no interaction in the application — it is simply an animated scene. If you want to follow along, download the source code too.

The Layer Hosting View

The first thing you need to do before you can start generating and animating layers is to provide a container for them in the user interface. This is just an ordinary NSView that has a special backing layer, and is known as the hosting view. You can see how such a hosting view can be configured in the setupHostView method of the Flea on Flea controller class FFController.

-(void)setupHostView {
    CALayer *layer = [CALayer layer];
    CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
    CGFloat components[4] = {0.0f, 0.0f, 0.0f, 1.0f};
    CGColorRef blackColor = CGColorCreate(colorSpace, components);
    layer.backgroundColor = blackColor;
    [hostView setLayer:layer];
    [hostView setWantsLayer:YES];
    CGColorRelease(blackColor);
    CGColorSpaceRelease(colorSpace);
}

A layer in Core Animation is represented by the CALayer class. To use the Core Animation classes, you need to add the QuartzCore framework to your project, and import the QuartzCore.h header.

#import <QuartzCore/QuartzCore.h>

You can use a vanilla CALayer instance as the backing layer, as shown above, but there are also several subclasses of CALayer, which can be useful if you need to render something more substantial in your hosting view. For example, if you want to have some OpenGL content in the hosting view, back it with a CAOpenGLLayer. (Note that just because you use a CAOpenGLLayer does not mean your view has to be an NSOpenGLView. In general, you should just use a plain NSView object, and it should not do any drawing of its own.)

Setting the backing layer of the hosting view is simple: you just use the setLayer: method, and make sure that you call setWantsLayer: passing in the argument YES.

[hostView setLayer:layer];
[hostView setWantsLayer:YES];

That’s often all there is to creating a backing layer, but in Flea on Flea we want to set the background color to black. To do that, we create a CGColor and set the backgroundColor property of the layer.

CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
CGFloat components[4] = {0.0f, 0.0f, 0.0f, 1.0f};
CGColorRef blackColor = CGColorCreate(colorSpace, components);
layer.backgroundColor = blackColor;
...
CGColorRelease(blackColor);
CGColorSpaceRelease(colorSpace);

The QuartzCore framework is quite low level, so you usually have to work with Core Graphics types and primitives, rather than Cocoa objects. For example, in the code above, a CGColor is created, rather than an NSColor. This can be a bit ungainly, because you have to worry about memory management and old-fashioned stuff like that, but you soon get used to it. Just remember that when you create a Core Graphics type, you release it when you are finished with it.

Adding Sublayers

With a hosting view in place, we can now add sublayers that move around in the host. Flea on Flea uses the createFleaLayerInLayer: method for this, which is called once from awakeFromNib.

-(void)awakeFromNib {
    [self setupHostView];
    [self createFleaLayerInLayer:hostView.layer];
    ...
}

createFleaLayerInLayer: adds a sublayer to the layer passed in, which in this case is the host view backing layer.

-(void)createFleaLayerInLayer:(CALayer *)parentLayer {
    CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
    CALayer *layer = [CALayer layer];
    CGRect frame = parentLayer.bounds;
    frame.origin.x += 20.0f;
    frame.size.width = parentLayer.bounds.size.width / 10.0f;
    frame.size.height = frame.size.width;
    layer.frame = frame;
    CGFloat components[4] = {1.0f, 1.0f, 1.0f, 1.0f};
    CGColorRef whiteColor = CGColorCreate(colorSpace, components);
    layer.backgroundColor = whiteColor;
    [parentLayer addSublayer:layer];
    CGColorRelease(whiteColor);
    CGColorSpaceRelease(colorSpace);
}

The new layer is created in the same way as the backing layer was, and its background color set to white. A layer has geometric properties similar to an NSView’s, such as a frame, which gives the size and position of the layer in its parent’s coordinate system, and bounds, which delineates positions in the layer’s own coordinates. In createFleaLayerInLayer:, the bounds of the parent layer are used to size the new sublayer. The new layer is made a tenth of the width of the parent layer, and positioned to the right of the parent layer’s origin (in the lower-left corner).

CGRect frame = parentLayer.bounds;
frame.origin.x += 20.0f;
frame.size.width = parentLayer.bounds.size.width / 10.0f;
frame.size.height = frame.size.width;
layer.frame = frame;

The addSublayer: method adds the new layer to the parent.

[parentLayer addSublayer:layer];

If you run Flea on Flea with only this code in place, it will draw a white square on a black background, but nothing will change. In the next section, we will see how to animate the square.

Animating Layers

In the very simple Flea on Flea example, the white sublayer will be made to move to random locations, as well as scale and rotate. There will be no user interaction, but the layer could easily be made to respond to mouse clicks or some other interaction.

In the awakeFromNib method, a timer is started to repeatedly invoke the changeDestination method every 2 seconds.

-(void)awakeFromNib {
    ...
    [NSTimer scheduledTimerWithTimeInterval:2.0
                                     target:self
                                   selector:@selector(changeDestination)
                                   userInfo:nil
                                    repeats:YES];
}

The changeDestination method chooses random values for the position of the sublayer, and its orientation, and starts an animation to move the layer toward those destination values.

-(void)changeDestination {
    [CATransaction begin];
    [CATransaction setValue:[NSNumber numberWithFloat:3.0f]
                     forKey:kCATransactionAnimationDuration];
    CALayer *layer = hostView.layer.sublayers.lastObject;
    layer.position = CGPointMake(hostView.bounds.size.width * rand()/(CGFloat)RAND_MAX,
                                 hostView.bounds.size.height * rand()/(CGFloat)RAND_MAX);
    CGFloat factor = rand()/(CGFloat)RAND_MAX * 2.0f;
    CATransform3D transform = CATransform3DMakeScale(factor, factor, 1.0f);
    transform = CATransform3DRotate(transform,
                                    acos(-1.0f)*rand()/(CGFloat)RAND_MAX,
                                    rand()/(CGFloat)RAND_MAX,
                                    rand()/(CGFloat)RAND_MAX,
                                    rand()/(CGFloat)RAND_MAX);
    layer.transform = transform;
    [CATransaction commit];
}

When you change properties of CALayer objects, the layer will automatically animate to the new values. So if you entered this in a program

layer.position = CGPointMake(50.0, 50.0);

the layer would fly to the new position. This is known as implicit animation. But what we have above in changeDestination is an example of explicit animation. When you use explicit animation, you use a CATransaction to group together a series of property changes, and set properties of the animation. A transaction begins with a call to the begin class method

[CATransaction begin];

and ends when the commit method is invoked.

[CATransaction commit];

In between you can set properties for the layers involved. The changeDestination method sets the position of the layer, and its transform property.

CALayer *layer = hostView.layer.sublayers.lastObject;
layer.position = CGPointMake(hostView.bounds.size.width * rand()/(CGFloat)RAND_MAX,
                             hostView.bounds.size.height * rand()/(CGFloat)RAND_MAX);
...
layer.transform = transform;

The transform has the CATransform3D type, and involves a random scaling of the layer, and a random rotation.

CGFloat factor = rand()/(CGFloat)RAND_MAX * 2.0f;
CATransform3D transform = CATransform3DMakeScale(factor, factor, 1.0f);
transform = CATransform3DRotate(transform,
                                acos(-1.0f)*rand()/(CGFloat)RAND_MAX,
                                rand()/(CGFloat)RAND_MAX,
                                rand()/(CGFloat)RAND_MAX,
                                rand()/(CGFloat)RAND_MAX);

The CATransform3DMakeScale function creates a transform with scaling factor arguments for x, y, and z. The CATransform3DRotate function applies a rotation to the transform passed in as the first argument; the rotation is through an angle (in radians) passed as the second argument, around a vector (x, y, z) passed as the last three arguments.

The duration of the animation is 0.25 seconds by default, but this can be changed by setting a value on the CATransaction class

[CATransaction setValue:[NSNumber numberWithFloat:3.0f] forKey:kCATransactionAnimationDuration];

Note that the animation has been set to take 3.0 seconds, but the timer that changes the destination fires every 2.0 seconds. In other words, the animation will not be able to complete before the timer starts a new animation. Can Core Animation cope with this? No problem.

Running Flea on Flea

If you downloaded the Flea on Flea Xcode project, build and run it. You should see a white square dancing across the screen, rotating and scaling as it goes. Note how it moves for 2 seconds, then changes direction. Core Animation interrupts any existing animations, and smoothly modifies the motion of the square to accommodate the new destination. Try changing the duration of the animation and the timer in the source code, to see what effect it has on the way the square moves.

Further Reading

Next time we will make Flea on Flea live up to its name, by adding layers, on layers, on… Until then, you can read more about Core Animation in the Core Animation Programming Guide, and in a new book by Bill Dudney which is still in beta at The Pragmatic Programmers.

Posted by 오늘마감
[아이폰 앱 개발] Cocoa for Scientists (Part XXV): Core Animation Layer Trees

Cocoa for Scientists (Part XXV): Core Animation Layer Trees

Author: Drew McCormack
Web Site: www.maccoremac.com

In the first part of this foray into Core Animation, we saw how you can animate the properties of a plain square layer, translating, scaling, and rotating in time. In this part, you’ll learn how to build up hierarchies of layers — the proverbial fleas on the fleas. To do this, we’ll finish off the Flea on Flea application, adding sublayers that creep and crawl on the backs of their superlayers, which in turn creep and crawl on the backs of theirs.

Flea on Flea

The Flea on Flea application (requires Mac OS X 10.5) is nothing more than an animated scene, but it demonstrates how you build up a layer tree: a hierarchy of child layers and their parents. Each flea layer in the application has other, smaller flea layers randomly walking its surface. You can’t interact with the scene, but you can resize the window to give the fleas more room to move.


The Flea on Flea application.

Flea on Flea is a fun way to be introduced to the layer tree, but you don’t have to think too hard to come up with scientific applications which utilize the same basic approach. Multi-scale is a trendy catch phrase in many fields, and with a bit of creativity, it should be possible to come up with some revolutionary new ways of navigating hierarchical data sets with Core Animation layer trees.

The Layer Tree

Much like NSViews, CALayers can be organized into a tree structure, with sublayers in superlayers, in super-superlayers, and so on. Just as for NSView, each layer moves in the coordinate system of its superlayer, which is given by the bounds property. The size and location of a layer relative to its superlayer are determined by its bounds rectangle, and its position, which is a point giving the location of the anchor point in the superlayer’s coordinate system. The anchor point represents the center of the layer’s world, and remains fixed during transformations like rotation and scaling. By default, the anchor point is at the center of the layer, but it can be moved by setting the anchorPoint property.
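As a quick illustration (a hypothetical snippet, not taken from Flea on Flea), moving the anchor point changes the point about which transforms pivot:

```objc
// Rotate a layer about its lower-left corner instead of its center.
// anchorPoint uses unit coordinates; (0.5, 0.5) is the default (the center).
layer.anchorPoint = CGPointMake(0.0f, 0.0f);
layer.transform = CATransform3DMakeRotation(M_PI_4, 0.0f, 0.0f, 1.0f);
```

Note that position still locates the anchor point in the superlayer, so moving the anchor point without adjusting position will shift where the layer appears.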



The geometry of a CALayer (image from Core Animation Programming Guide).

There are many similarities between NSView and CALayer, but also some subtle differences. For example, CALayer does not crop its sublayers’ drawing by default; if you want this behavior, you need to set the layer’s masksToBounds property to YES. Layers also have a few tricks that views don’t, such as the ability to set a background color (backgroundColor property); round corners (cornerRadius property); draw a drop shadow (shadowOpacity, shadowOffset, shadowRadius, and shadowColor); and include a border (borderWidth and borderColor). You can even set Core Image filters to change the appearance of the layer’s content, or whatever lies directly behind it (e.g. you could blur the background).
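A short sketch of these layer-only styling properties (illustrative values only, not from the Flea on Flea source):

```objc
layer.masksToBounds = YES;                      // crop sublayer drawing to the layer's bounds
layer.cornerRadius  = 8.0f;                     // rounded corners
layer.borderWidth   = 2.0f;                     // border (borderColor is black by default)
layer.shadowOpacity = 0.5f;                     // a nonzero opacity enables the drop shadow
layer.shadowOffset  = CGSizeMake(0.0f, -3.0f);
layer.shadowRadius  = 4.0f;
```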



Flea on Flea with random background colors and rounded corners.

Growing the Tree

The Flea on Flea source code is not much more advanced than where we left it last time. The awakeFromNib method downloads an image of a flea, and stores it in an instance variable.

-(void)awakeFromNib {
    // Download flea image
    NSURL *url = [NSURL URLWithString:@"http://medent.usyd.edu.au/fact/flea.gif"];
    CGImageSourceRef imageSource = CGImageSourceCreateWithURL((CFURLRef)url, NULL);
    fleaImage = CGImageSourceCreateImageAtIndex(imageSource, 0, NULL);
    CFRelease(imageSource);

    // Setup layers
    [self setupHostView];
    [self createFleas];

    // Start timer
    [NSTimer scheduledTimerWithTimeInterval:3.0
                                     target:self
                                   selector:@selector(changeDestinations)
                                   userInfo:nil
                                    repeats:YES];
}

As usual, Core Graphics types and functions are used; in this case, a CGImageSource is used to download an image from a URL, and create a CGImage, which is the image type used in Core Animation.

The awakeFromNib method continues by calling a method to set up the host view, which is unchanged from last time, and a method to initialize the flea layers and build the layer tree.

The createFleas method creates an array to store references to all of the flea layers, and then calls another routine to do the hard work.

-(void)createFleas {
    allFleaLayers = [NSMutableArray array];
    [self createFleasInLayer:hostView.layer atDepth:0];
}

createFleasInLayer:atDepth: is a method designed to be called recursively. It adds new sublayers to the layer passed in, and then calls itself on each of them to add their own sublayers, up to a maximum layer depth.

-(void)createFleasInLayer:(CALayer *)layer atDepth:(NSUInteger)depth {
    static const NSUInteger MAX_DEPTH = 2;
    static const NSUInteger NUMBER_FLEAS_PER_LAYER = 4;
    if ( depth > MAX_DEPTH ) return;
    NSUInteger fleaIndex;
    for (fleaIndex = 0; fleaIndex < NUMBER_FLEAS_PER_LAYER; ++fleaIndex) {
        CALayer *newLayer = [self createFleaLayerInLayer:layer];
        [allFleaLayers addObject:newLayer];
        [self createFleasInLayer:newLayer atDepth:depth+1];
    }
}

Each flea layer is initialized in the createFleaLayerInLayer: method.

-(CALayer *)createFleaLayerInLayer:(CALayer *)parentLayer {
    CALayer *layer = [CALayer layer];

    // Choose random location in parent layer
    float rand1 = rand()/(float)RAND_MAX;
    float rand2 = rand()/(float)RAND_MAX;
    float parentWidth = CGRectGetWidth(parentLayer.bounds);
    float parentHeight = CGRectGetHeight(parentLayer.bounds);
    layer.position = CGPointMake(rand1*parentWidth, rand2*parentHeight);
    layer.bounds = CGRectMake(0.0, 0.0, parentWidth*0.4, parentHeight*0.4);
    layer.opacity = 0.9;

    // Set image
    layer.contents = (id)fleaImage;

    // Add to parent
    [parentLayer addSublayer:layer];
    return layer;
}

The layer is given a random position in the bounds of its superlayer. Its contents property is set to the CGImageRef created in awakeFromNib, and the layer opacity is set to 0.9 (to give it that icky translucence popular with small insects).

That’s almost all there is to it. The only aspect remaining is the animation itself. A timer, which is created in awakeFromNib, fires every 3 seconds, and invokes the changeDestinations method. This method loops over all flea layers, assigning each a random position and a rotation around the z-axis.

-(void)changeDestinations {
    [CATransaction begin];
    [CATransaction setValue:[NSNumber numberWithFloat:3.0f]
                     forKey:kCATransactionAnimationDuration];
    for ( CALayer *layer in allFleaLayers ) {
        // Choose a random position in the superlayer
        layer.position = CGPointMake(layer.superlayer.bounds.size.width * rand()/(CGFloat)RAND_MAX,
                                     layer.superlayer.bounds.size.height * rand()/(CGFloat)RAND_MAX);

        // Choose a random rotation around the z axis
        CATransform3D transform = CATransform3DIdentity;
        transform = CATransform3DRotate(transform, acos(-1.0)*rand()/(CGFloat)RAND_MAX, 0.0, 0.0, 1.0);
        layer.transform = transform;
    }
    [CATransaction commit];
}

Being Core Animation, these changes don’t take effect immediately, but provide a target for the animation. The code to setup and commit the CATransaction is exactly the same as last time.

Things to Try

Flea on Flea is a fun application to play with. You can quite easily modify aspects of it to change the appearance and behavior of the flea layers. For example, if you want to add a colored drop shadow to the fleas, you could change the createFleasInLayer:atDepth: method as follows:

-(void)createFleasInLayer:(CALayer *)layer atDepth:(NSUInteger)depth {
    static const NSUInteger MAX_DEPTH = 2;
    static const NSUInteger NUMBER_FLEAS_PER_LAYER = 4;
    if ( depth > MAX_DEPTH ) return;
    NSUInteger fleaIndex;
    for (fleaIndex = 0; fleaIndex < NUMBER_FLEAS_PER_LAYER; ++fleaIndex) {
        CALayer *newLayer = [self createFleaLayerInLayer:layer];

        // Set the shadow color based on depth
        CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
        CGFloat components[4] = {0.0f, 0.0f, 0.0f, 1.0f};
        switch (depth) {
            case 1:
                components[1] = 1.0f;
                break;
            case 2:
                components[2] = 1.0f;
                break;
            case 3:
                components[0] = 1.0f;
                components[1] = 1.0f;
                components[2] = 1.0f;
                break;
        }
        CGColorRef color = CGColorCreate(colorSpace, components);
        newLayer.shadowColor = color;
        newLayer.shadowOpacity = 0.5;
        newLayer.shadowRadius = 2.0;
        newLayer.shadowOffset = CGSizeMake(0.0, 0.0);
        CGColorRelease(color);
        CGColorSpaceRelease(colorSpace);

        [allFleaLayers addObject:newLayer];
        [self createFleasInLayer:newLayer atDepth:depth+1];
    }
}

It’s also worth observing the effect of setting the masksToBounds property of new layers to YES, as this is a one line change.
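The one-line change would look like this (assuming you add it in createFleaLayerInLayer:, right after the layer is created):

```objc
layer.masksToBounds = YES;   // child fleas are now cropped to their parent's bounds
```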



Flea on Flea with a Core Image Bloom filter applied to each layer.

Lastly, there are a multitude of Core Image filters that can be applied. Here’s how you can modify createFleaLayerInLayer: to apply a bloom effect to the fleas.

-(CALayer *)createFleaLayerInLayer:(CALayer *)parentLayer {
    CALayer *layer = [CALayer layer];

    // Choose random location in parent layer
    float rand1 = rand()/(float)RAND_MAX;
    float rand2 = rand()/(float)RAND_MAX;
    float parentWidth = CGRectGetWidth(parentLayer.bounds);
    float parentHeight = CGRectGetHeight(parentLayer.bounds);
    layer.position = CGPointMake(rand1*parentWidth, rand2*parentHeight);
    layer.bounds = CGRectMake(0.0, 0.0, parentWidth*0.4, parentHeight*0.4);
    layer.opacity = 0.9;

    // Set image
    layer.contents = (id)fleaImage;

    // Set Core Image filter
    CIFilter *bloomFilter = [CIFilter filterWithName:@"bloom"];
    if ( nil == bloomFilter ) {
        bloomFilter = [CIFilter filterWithName:@"CIBloom" keysAndValues:
            kCIInputRadiusKey, [NSNumber numberWithFloat:1.0],
            kCIInputIntensityKey, [NSNumber numberWithFloat:10.0], nil];
        bloomFilter.name = @"bloom";
    }
    layer.filters = [NSArray arrayWithObject:bloomFilter];

    // Add to parent
    [parentLayer addSublayer:layer];
    return layer;
}

Be warned: filters and drop shadows seem to be demanding on the graphics hardware, and you might want to reduce the MAX_DEPTH and/or NUMBER_FLEAS_PER_LAYER constants before using them.

http://www.macresearch.org/cocoa-scientists-part-xxv-core-animation-layer-trees

[아이폰 앱 개발] iPad for iPhone Developers 101: UIPopoverController Tutorial

iPad for iPhone Developers 101: UIPopoverController Tutorial

http://www.raywenderlich.com/1056/ipad-for-iphone-developers-101-uipopovercontroller-tutorial

This is the second part of a three part series to help get iPhone Developers up-to-speed with iPad development by focusing on three of the most interesting new classes (at least to me): UISplitView, UIPopoverController, and Custom Input Views.

In the first part of the series, we made an app with a split view that displays a list of monsters on the left side, and details on the selected monster on the right side.

In this part, we’re going to try out popovers with a simple example: we’ll add a popover to let the user select from a list of colors to change the color of the monster’s name.

We’ll start out with where we left off the project last time, so grab a copy if you don’t have it already.

Creating our Color Picker

Let’s start by creating the view that we’ll use to let the user pick between a list of colors. We’ll make this a simple table view with a list of color names.

So go to “File\New File…”, pick “UIViewController subclass”, and make sure “Targeted for iPad” and “UITableViewController subclass” are checked but “With XIB for user interface” is NOT checked, and click Next. Name the class ColorPickerController, and click Finish.

Then replace ColorPickerController.h with the following:


@protocol ColorPickerDelegate
- (void)colorSelected:(NSString *)color;
@end
 
 
@interface ColorPickerController : UITableViewController {
    NSMutableArray *_colors;
    id<ColorPickerDelegate> _delegate;
}
 
@property (nonatomic, retain) NSMutableArray *colors;
@property (nonatomic, assign) id<ColorPickerDelegate> delegate;
 
@end

Here we declare a delegate so that this class can notify another class when a user selects a color. We then declare two variables/properties: one for the list of colors to display, and one to store the delegate itself.

Then make the following changes to ColorPickerController.m:


// Under @implementation
@synthesize colors = _colors;
@synthesize delegate = _delegate;
 
// Add viewDidLoad like the following:
- (void)viewDidLoad {
    [super viewDidLoad];
    self.clearsSelectionOnViewWillAppear = NO;
    self.contentSizeForViewInPopover = CGSizeMake(150.0, 140.0);
    self.colors = [NSMutableArray array];
    [_colors addObject:@"Red"];
    [_colors addObject:@"Green"];
    [_colors addObject:@"Blue"];
}
 
// in numberOfSectionsInTableView:
return 1;
 
// in numberOfRowsInSection:
return [_colors count];
 
// In cellForRowAtIndexPath, under configure the cell:
NSString *color = [_colors objectAtIndex:indexPath.row];
cell.textLabel.text = color;
 
// In didSelectRowAtIndexPath:
if (_delegate != nil) {
    NSString *color = [_colors objectAtIndex:indexPath.row];
    [_delegate colorSelected:color];
}
 
// In dealloc
self.colors = nil;
self.delegate = nil;

Most of this should be normal table view stuff except for the following line:


self.contentSizeForViewInPopover = CGSizeMake(150.0, 140.0);

This line sets how large the popover should be when it is displayed. If you do not add this line, by default the popover will take the entire height of the screen (which is usually too large).

Displaying the Picker

Believe it or not, that was the hardest part. Now to display the picker, all we need to do is add a button to our toolbar, and a little bit of code to display it and handle the selection.

So first, let’s add the button. Open up RightViewController.xib and add a Bar Button Item to the toolbar. Set the title of the button to “Set Color”.

Now let’s declare a method for the button to trigger in RightViewController.h and declare a few variables we’ll need in a minute:


// Up top, under #import
#import "ColorPickerController.h"
 
// Modify class declaration
@interface RightViewController : UIViewController <MonsterSelectionDelegate,  
    UISplitViewControllerDelegate, ColorPickerDelegate> {
 
// Inside class
ColorPickerController *_colorPicker;
UIPopoverController *_colorPickerPopover;
 
// In property section
@property (nonatomic, retain) ColorPickerController *colorPicker;
@property (nonatomic, retain) UIPopoverController *colorPickerPopover;
 
- (IBAction)setColorButtonTapped:(id)sender;

Before we forget, go ahead and connect the action method to the Bar Button Item in Interface Builder by control-dragging from the Bar Button Item to File’s Owner and connecting to the “setColorButtonTapped” outlet.

Then let’s finish by making the required changes to RightViewController.m:


// In synthesize section
@synthesize colorPicker = _colorPicker;
@synthesize colorPickerPopover = _colorPickerPopover;
 
// In dealloc
self.colorPicker = nil;
self.colorPickerPopover = nil;
 
// Add to end of file
- (void)colorSelected:(NSString *)color {
    if ([color compare:@"Red"] == NSOrderedSame) {
        _nameLabel.textColor = [UIColor redColor];
    } else if ([color compare:@"Green"] == NSOrderedSame) {
        _nameLabel.textColor = [UIColor greenColor];
    } else if ([color compare:@"Blue"] == NSOrderedSame){
        _nameLabel.textColor = [UIColor blueColor];
    }
    [self.colorPickerPopover dismissPopoverAnimated:YES];
}
 
- (IBAction)setColorButtonTapped:(id)sender {
    if (_colorPicker == nil) {
        self.colorPicker = [[[ColorPickerController alloc] 
            initWithStyle:UITableViewStylePlain] autorelease];
        _colorPicker.delegate = self;
        self.colorPickerPopover = [[[UIPopoverController alloc] 
            initWithContentViewController:_colorPicker] autorelease];               
    }
    [self.colorPickerPopover presentPopoverFromBarButtonItem:sender 
        permittedArrowDirections:UIPopoverArrowDirectionAny animated:YES];
}

Ok let’s explain this a bit. A popover is simply a “wrapper” around an existing view controller: it “floats” the view controller in a certain spot and possibly displays an arrow showing what the popover is related to. You can see this in setColorButtonTapped – we create our color picker, and then wrap it with a popover controller.

Then we call a method on the popover controller to display it in the view: presentPopoverFromBarButtonItem:permittedArrowDirections:animated:.

When the user is done, they can tap anywhere outside the popover to dismiss it automatically. However if they select a color, we also want it to be dismissed, so we call the dismissPopoverAnimated method to get rid of the popover on-demand (as well as setting the color appropriately).

And that’s it! Compile and run and when you tap the “Set Color” bar button item, you should see a popover like the following that changes the label color:

You will find yourself using popovers quite a bit in places where users need to edit a field or toggle a setting, rather than the iPhone style where you navigate to the next level in a UINavigationController. They call this “flattening the hierarchy” in the iPad docs.

Show Me the Code!

Here’s a copy of all of the code we’ve developed so far.

Check out the next part of the series, where we cover how to use custom input views on the iPad!

Posted by 오늘마감
iPad for iPhone Developers 101: UIPopoverController Tutorial

iPad for iPhone Developers 101: UIPopoverController Tutorial

http://www.raywenderlich.com/1056/ipad-for-iphone-developers-101-uipopovercontroller-tutorial

This is the second part of a three part series to help get iPhone Developers up-to-speed with iPad development by focusing on three of the most interesting new classes (at least to me): UISplitView, UIPopoverController, and Custom Input Views.

In the first part of the series, we made an app with a split view that displays a list of monsters on the left side, and details on the selected monster on the right side.

In this part, we’re going to try out popovers view with a simple example: we’ll add a popover to let the user select from a list of colors to change the color of the monster’s name.

We’ll start out with where we left off the project last time, so grab a copy if you don’t have it already.

Creating our Color Picker

Let’s start by creating the view that we’ll use to let the user pick between a list of colors. We’ll make this a simple table view with a list of color names.

So go to “File\New File…”, pick “UIViewController subclass”, and make sure “Targeted for iPad” and “UITableViewController subclass” are checked but “With XIB for user interface” is NOT checked, and click Next. Name the class ColorPickerController, and click Finish.

Then replace ColorPickerController.h with the following:


@protocol ColorPickerDelegate - (void)colorSelected:(NSString *)color; @end     @interface ColorPickerController : UITableViewController { NSMutableArray *_colors; id<ColorPickerDelegate> _delegate; }   @property (nonatomic, retain) NSMutableArray *colors; @property (nonatomic, assign) id<ColorPickerDelegate> delegate;   @end

Here we declare a delegate so that this class can notify another class when a user selects a color. We then declare two variables/properties: one for the list of colors to display, and one to store the delegate itself.

Then make the following changes to ColorPickerController.m:


// Under @implementation @synthesize colors = _colors; @synthesize delegate = _delegate;   // Add viewDidLoad like the following: - (void)viewDidLoad { [super viewDidLoad]; self.clearsSelectionOnViewWillAppear = NO; self.contentSizeForViewInPopover = CGSizeMake(150.0, 140.0); self.colors = [NSMutableArray array]; [_colors addObject:@"Red"]; [_colors addObject:@"Green"]; [_colors addObject:@"Blue"]; }   // in numberOfSectionsInTableView: return 1;   // in numberOfRowsInSection: return [_colors count];   // In cellForRowAtIndexPath, under configure the cell: NSString *color = [_colors objectAtIndex:indexPath.row]; cell.textLabel.text = color;   // In didSelectRowAtIndexPath: if (_delegate != nil) { NSString *color = [_colors objectAtIndex:indexPath.row]; [_delegate colorSelected:color]; }   // In dealloc self.colors = nil; self.delegate = nil;

Most of this should be normal table view stuff except for the following line:


self.contentSizeForViewInPopover = CGSizeMake(150.0, 140.0);

This line sets the size of how large the popover should be when it is displayed. If you do not add this line, by default the popover will be the entire height of the screen (which is usually too large).

Displaying the Picker

Believe it or not, that was the hardest part. Now to display the picker, all we need to do is add a button to our toolbar, and a little bit of code to display it and handle the selection.

So first, let’s add the button. Open up RightViewController.xib and add a Bar Button Item to the toolbar. Set the title of the button “Set Color”.

Now let’s declare a method for the button to trigger in RightViewController.h and declare a few variables we’ll need in a minute:


// Up top, under #import #import "ColorPickerController.h"   // Modfiy class declaration @interface RightViewController : UIViewController <MonsterSelectionDelegate, UISplitViewControllerDelegate, ColorPickerDelegate> {   // Inside class ColorPickerController *_colorPicker; UIPopoverController *_colorPickerPopover;   // In property section @property (nonatomic, retain) ColorPickerController *colorPicker; @property (nonatomic, retain) UIPopoverController *colorPickerPopover;   - (IBAction)setColorButtonTapped:(id)sender;

Before we forget, go ahead and connect the action method to the Bar Button Item in Interface Builder by control-dragging from the Bar Button Item to File’s Owner and connecting to the “setColorButtonTapped” outlet.

Then let’s finish by making the required changes to RightViewController.m:


// In synthesize section @synthesize colorPicker = _colorPicker; @synthesize colorPickerPopover = _colorPickerPopover;   // In dealloc self.colorPicker = nil; self.colorPickerPopover = nil;   // Add to end of file - (void)colorSelected:(NSString *)color { if ([color compare:@"Red"] == NSOrderedSame) { _nameLabel.textColor = [UIColor redColor]; } else if ([color compare:@"Green"] == NSOrderedSame) { _nameLabel.textColor = [UIColor greenColor]; } else if ([color compare:@"Blue"] == NSOrderedSame){ _nameLabel.textColor = [UIColor blueColor]; } [self.colorPickerPopover dismissPopoverAnimated:YES]; }   - (IBAction)setColorButtonTapped:(id)sender { if (_colorPicker == nil) { self.colorPicker = [[[ColorPickerController alloc] initWithStyle:UITableViewStylePlain] autorelease]; _colorPicker.delegate = self; self.colorPickerPopover = [[[UIPopoverController alloc] initWithContentViewController:_colorPicker] autorelease]; } [self.colorPickerPopover presentPopoverFromBarButtonItem:sender permittedArrowDirections:UIPopoverArrowDirectionAny animated:YES]; }

Ok, let’s explain this a bit. A popover is just a “wrapper” around an existing view controller that “floats” it in a certain spot and optionally displays an arrow showing what the popover relates to. You can see this in setColorButtonTapped – we create our color picker, then wrap it with a popover controller.

Then we call a method on the popover controller to display it in the view. Here we use the helper method presentPopoverFromBarButtonItem:permittedArrowDirections:animated: to display the popover.

When the user is done, they can tap anywhere outside the popover to dismiss it automatically. However, if they select a color, we also want it to be dismissed, so we call the dismissPopoverAnimated: method to get rid of the popover on demand (as well as setting the color appropriately).

And that’s it! Compile and run and when you tap the “Set Color” bar button item, you should see a popover like the following that changes the label color:

You will find yourself using popovers quite a bit in places where users need to edit a field or toggle a setting, rather than the iPhone style where you navigate to the next level in a UINavigationController. They call this “flattening the hierarchy” in the iPad docs.

Show Me the Code!

Here’s a copy of all of the code we’ve developed so far.

Check out the next part of the series, where we cover how to use custom input views on the iPad!

Posted by 오늘마감 (XCODE, 2010.06.25 07:44)

iPhone Toolchain for Linux
Building the Open Source Toolchain

Perhaps you object to building applications linked with embedded DRM, or maybe you just like using open source software. The open source toolchain has existed since August 2007 and has grown considerably since then. It now runs on Linux, Windows, and older versions of Mac OS X, and it is a much cheaper option for developers who don’t want to convert an entire development department to Leopard-based Macs.

Several binary distributions of the open source toolchain exist. Here we will simply walk through using a machine with Leopard installed as our iPhone development platform.

What You’ll Need

Unofficial binary distributions of the toolchain float around the Internet, but here we will build it directly from source. To build from source, you will need the items listed below.

A Supported Desktop Platform

First, you need a supported desktop platform. The platforms currently supported by the toolchain are:
  • Mac OS X 10.4 Intel or PPC
  • Mac OS X 10.5 Intel
  • Ubuntu Feisty Fawn, Intel
  • Ubuntu Gutsy Gibbon, Intel
  • Fedora Core, Intel
  • Gentoo Linux 2007.0, x86_64
  • Debian 2.6.18
  • CentOS 4
  • Windows / Cygwin
Nicholas Penree of Conceited Software got the toolchain installation working on Leopard, and in our example we will use his method with my own notes added for iPhone OS support. Other platforms follow basically the same procedure. The toolchain’s official documentation is at http://code.google.com/p/iphone-dev/wiki/Building.

A High-Speed Internet Connection

The toolchain’s sources alone run to hundreds of megabytes. Unless you download them over a fast connection, it could take days. If your Internet connection isn’t that fast, it may be worth doing the installation from a library or coffee shop. [Translator’s note: in Korea, a sufficiently fast home connection will do just fine.]

Open Source Tools

Next, the following open source tools must be installed on your desktop:
    • bison (v1.28 or later)
    • flex (v2.5.4 or later)
    • gcc (the GNU compiler that handles C, C++, and Objective-C )
    • svn (the Subversion source control utility)
If any of these are missing, download and install them before continuing. On the Mac they are included in the Xcode tools, so upgrading to the latest version of Xcode is sufficient. Most other operating systems offer them as optional packages in their distributions.

The Xcode tools can be downloaded from Apple’s website at http://developer.apple.com/tools/xcode/.

The Apple SDK

Finally, you need a copy of the Apple SDK for the iPhone OS libraries and frameworks. Your applications will be linked against these when they are built.

If you already have the Apple SDK installed, you only need to symlink its framework libraries to a place where the toolchain can see them:
sudo mkdir -p /usr/local/share
sudo ln -s /Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS2.0.sdk /usr/local/share/iphone-filesystem
If you can’t install these libraries on your desktop, you can instead download the libraries and frameworks directly from an iPhone running a jailbroken 1.2/2.0 filesystem.

Apple does not permit copying libraries from the iPhone to the desktop, so you will need to check whether this approach is legal where you live.

Assuming you have installed SSH on your iPhone, copy the files into the /usr/local/share/iphone-filesystem folder with the following commands:
# mkdir -p /usr/local/share/iphone-filesystem
# cd /usr/local/share/iphone-filesystem
# mkdir -p ./System/Library ./usr
# scp -r root@[IPHONE IP ADDRESS]:/System/Library/Frameworks/ .
# mv Frameworks ./System/Library
# scp -r root@iphone:/usr/lib .
# mv lib ./usr
Compiling the Toolchain

The source code needed for the toolchain is split across two repositories: one holds the LLVM compiler framework and the other holds the rest of the toolchain. Create a build directory to work in and cd into it. Then use Subversion to check out the two projects from their respective repositories:
$ svn co http://llvm.org/svn/llvm-project/llvm/trunk llvm-svn -r 42498
$ svn co http://iphone-dev.googlecode.com/svn/trunk/ iphone-dev
You also need to switch out two of the Subversion directories to pull in newer versions of several pieces of iPhone OS support; once the iPhone OS is officially released, these will eventually be merged into the root of the source tree:
$ pushd include
$ svn switch http://iphone-dev.googlecode.com/svn/branches/include-1.2-sdk
$ popd
$ pushd odcctools
$ svn switch http://iphone-dev.googlecode.com/svn/branches/odcctools-9.2-ld
$ popd
Depending on the speed of your connection, the download can take an hour or more. Once both repositories have been checked out, it’s time to start building.

You may not be familiar with the built-in shell commands pushd and popd. They are similar to cd, but additionally push and pop directories on a stack. This lets you do something in a new directory and then return to the previous directory without having to remember where that was.

You will also see the sudo command used; this is a Unix tool that runs a command with special (root) privileges. To run a particular command as root (which can be dangerous, since it grants access to sensitive parts of the operating system and the power to break it), prefix the command with sudo. On Mac OS X you will be asked for an administrator password before the command runs. If you don’t have sudo, run the same commands without the sudo prefix, after first gaining root privileges with su.

Step 1: Build and Install the LLVM Framework

The LLVM (Low Level Virtual Machine) framework provides a standard infrastructure for building compilers. LLVM supplies the hooks and APIs needed to build a standardized compiler without rewriting all of the fundamental components. Compile and install a release build of the LLVM compiler with the following commands:
$ pushd llvm-svn
$ ./configure --enable-optimized
$ make ENABLE_OPTIMIZED=1
$ sudo make install
$ LLVMOBJDIR=`pwd`
$ popd
Step 2: Build and Install the Cross-Compiler Tools

Build and install the cross-compiler tools with the following commands. These are for Mac OS X; if you are on a different platform, consult the official documentation and adjust accordingly.
$ pushd iphone-dev
$ sudo mkdir /usr/local/arm-apple-darwin
$ mkdir -p build/odcctools
$ pushd build/odcctools
$ ../../odcctools/configure --target=arm-apple-darwin --disable-ld64
$ export INCPRIVEXT="-isysroot /Developer/SDKs/MacOSX10.4u.sdk"
$ make
$ sudo make install
$ popd
$ HEAVENLY=/usr/local/share/iphone-filesystem
Step 3: Install the Low-Level API Headers

Because the iPhone’s architecture differs from the desktop’s, special headers must be installed to access the iPhone’s low-level APIs. Install them with the following commands:
$ pushd include
$ ./configure --with-macosx-sdk=/Developer/SDKs/MacOSX10.4u.sdk
$ sudo bash install-headers.sh
$ popd
Step 4: Install Csu

Csu provides C hooks into the assembly “start” entry point and sets up the stack so that your program’s main() function is called. It is required glue code.
$ mkdir -p build/csu
$ pushd build/csu
$ ../../csu/configure --host=arm-apple-darwin
$ sudo make install
$ popd
Step 5: Build and Install llvm-gcc

Now that LLVM, the cross-compiler tools, and Csu have been built, build and install the compiler itself. If you are picking up from here, or have closed your terminal window at any point, make sure the environment variables $LLVMOBJDIR and $HEAVENLY are set to the appropriate directories. LLVMOBJDIR should point to the location of the compiled LLVM object files from the LLVM build; it is used when building llvm-gcc. HEAVENLY should point to the location on the desktop where you copied the iPhone’s libraries, which depends on where you put them; this directory is used by llvm-gcc to link with the frameworks and libraries when compiling applications. The name “Heavenly” was Apple’s code name for the 1.0 code base of the iPhone software. The recent version (1.2) is named “Aspen,” but the toolchain continues to use the original name. Both names come from ski slopes.
      $ set | grep -e LLVMOBJDIR -e HEAVENLY 
If that command produces no output, you need to set the environment variables again. Return to the build directory and run:
$ pushd llvm-svn && LLVMOBJDIR=`pwd` && popd
$ HEAVENLY=/usr/local/share/iphone-filesystem
Once you are sure these are all in place, enter the following commands to build and install the compiler:
$ mv llvm-gcc-4.0-iphone/configure llvm-gcc-4.0-iphone/configure.old
$ sed 's/^FLAGS_FOR_TARGET=$/FLAGS_FOR_TARGET=${FLAGS_FOR_TARGET-}/g' llvm-gcc-4.0-iphone/configure.old > llvm-gcc-4.0-iphone/configure
$ sudo ln -s /usr/local/arm-apple-darwin/lib/crt1.o /usr/local/arm-apple-darwin/lib/crt1.10.5.o
$ mkdir -p build/llvm-gcc-4.0-iphone
$ pushd build/llvm-gcc-4.0-iphone
$ export FLAGS_FOR_TARGET="-mmacosx-version-min=10.1"
$ sh ../../llvm-gcc-4.0-iphone/configure --enable-llvm=`llvm-config --obj-root` --enable-languages=c,c++,objc,obj-c++ --target=arm-apple-darwin --enable-sjlj-exceptions --with-heavenly=$HEAVENLY --with-as=/usr/local/bin/arm-apple-darwin-as --with-ld=/usr/local/bin/arm-apple-darwin-ld
$ make LLVM_VERSION_INFO=2.0-svn-iphone-dev-0.3-svn
$ sudo make install
$ popd
$ popd
Congratulations! You now have a free toolchain for the iPhone installed, and you are ready to compile iPhone applications. You can invoke the compiler directly as /usr/local/bin/arm-apple-darwin-gcc.

When building applications for iPhone OS, additionally use the following CFLAGS:
CFLAGS = -fobjc-abi-version=2 -F/usr/local/share/iphone-filesystem/System/Library/PrivateFrameworks
and the following LDFLAGS:
      -lobjc 
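Putting these flags together, a minimal Makefile for a hypothetical command-line tool might look like the sketch below. The target name, source file, and the extra -framework flags are my assumptions for illustration, not part of the original instructions; adjust them to your project.

```makefile
# Hypothetical Makefile sketch using the toolchain built above.
CC      = /usr/local/bin/arm-apple-darwin-gcc
CFLAGS  = -fobjc-abi-version=2 -F/usr/local/share/iphone-filesystem/System/Library/PrivateFrameworks
LDFLAGS = -lobjc -framework CoreFoundation -framework Foundation

HelloWorld: HelloWorld.m
	$(CC) $(CFLAGS) $(LDFLAGS) -o $@ $^
```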
      Resources
      • iPhone Open Application Development -- book by Jonathan Zdziarski
      • iPhone Open Application Development Forum -- Join conversations with author Jonathan Zdziarski, and other developers designing third-party software that will run on the iPhone

Source: http://network.hanb.co.kr/view.php?bi_id=1530



Source: http://blog.naver.com/PostView.nhn?blogId=musicnet&logNo=10032583533
OpenGL ES Programming Guide for iPhone

Ch1. OpenGL ES on the iPhone

OpenGL ES is a client of Core Animation.

Your application creates a UIView class with a special Core Animation layer, a CAEAGLLayer.

A CAEAGLLayer object is aware of OpenGL ES and can be used to create rendering targets that act as part of Core Animation.

When your application finishes rendering a frame, you present the contents of the CAEAGLLayer object, where they will be composited with the data from other views.


* categories of functions

- Reading the current state of an OpenGL ES context.

- Changing state variables in an OpenGL ES context.

- Creating, modifying or destroying OpenGL ES objects.

- Submitting geometry to be rendered (rasterized to a framebuffer).


* objects

- texture: image

- buffer: set of memory (vertex data)

- shader

- renderbuffer: normally as a part of a framebuffer

- framebuffer: ultimate destination of the graphics pipeline


* common behavior of objects

- generate an OID: it simply allocates a reference to an object

- bound to an OpenGL ES context: the first time you bind to an object identifier, OpenGL ES allocates memory and initializes that object

- modify the state

- used for rendering

- deleted


On the iPhone, OpenGL ES objects are managed by a sharegroup object.


Two or more rendering contexts can be configured to use the same sharegroup.


Apple does not provide a platform interface for creating framebuffer objects. Instead, all framebuffer objects are created using the OES_framebuffer_object extension.


* framebuffer creating procedure

1. Generate and bind a framebuffer object.

2. Generate, bind and configure an image.

3. Attach the image to the framebuffer.

4. Repeat steps 2 and 3 for other images.

5. Test the framebuffer for completeness.


All implementations of OpenGL ES require some platform-specific code to create a rendering context and to use it to draw to the screen.

EAGL: Embedded Apple OpenGL Extension for Mac OS X


* EAGLContext class

- defines rendering context

- target of OpenGL ES commands

- presents images to Core Animation for display


Every EAGLContext object contains a reference to an EAGLSharegroup object (texture, buffer, framebuffer, shader, ...).


EAGLDrawable protocol: an object that can be used to allocate storage for a renderbuffer that can later be presented to the user. Implemented only by the CAEAGLLayer class.


** OpenGL ES on the iPhone

= OpenGL ES 1.1 (fixed function graphics pipeline)

+ OpenGL ES 2.0 (shader pipeline)


** If your application fails to test the capabilities of OpenGL ES at runtime, it may crash or fail to run.


- v1.1: good baseline behavior for a 3D graphics pipeline (available on all iPhone and iPod touch devices)

- v2.0: more flexible; custom vertex and fragment operations can be implemented directly with shaders. Note that this is not a superset of 1.1.


EAGLContext* myContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES<1|2>];

If a particular implementation of OpenGL ES is not available, initWithAPI: will return nil. Your application must test the returned context before using it.


Ch2. Determining OpenGL ES Capabilities

Since the capabilities of the context will not change once it has been created, your application can test them once and determine which path it will use.


* common: GL_MAX_TEXTURE_SIZE, GL_DEPTH_BITS, GL_STENCIL_BITS

* 1.1: GL_MAX_TEXTURE_UNITS, GL_MAX_CLIP_PLANES

* 2.0: GL_MAX_VERTEX_ATTRIBS, GL_MAX_VERTEX_UNIFORM_VECTORS, GL_MAX_FRAGMENT_UNIFORM_VECTORS, GL_MAX_VARYING_VECTORS, GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, GL_MAX_TEXTURE_IMAGE_UNITS


BOOL CheckForExtension(NSString *searchName)
{
    // glGetString returns const GLubyte *, so cast for stringWithCString:
    NSString *extensionsString = [NSString stringWithCString:(const char *)glGetString(GL_EXTENSIONS)
                                                    encoding:NSASCIIStringEncoding];
    NSArray *extensionsNames = [extensionsString componentsSeparatedByString:@" "];
    return [extensionsNames containsObject:searchName];
}
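For completeness, here is a plain-C version of the same check, usable outside Objective-C code. Unlike a naive strstr search, it matches whole space-separated tokens, so searching for one extension name will not falsely match a longer name that merely starts with it. The function name is mine, purely illustrative.

```c
#include <stdbool.h>
#include <string.h>

/* Return true if `name` appears as a whole space-separated token in
 * `extensions` (the string normally obtained from glGetString(GL_EXTENSIONS)). */
static bool has_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;
    while ((p = strstr(p, name)) != NULL) {
        /* The token must start at the beginning or right after a space... */
        bool starts_ok = (p == extensions) || (p[-1] == ' ');
        /* ...and end at a space or at the end of the string. */
        bool ends_ok = (p[len] == ' ') || (p[len] == '\0');
        if (starts_ok && ends_ok)
            return true;
        p += len;  /* keep scanning past this partial match */
    }
    return false;
}
```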


The debug version of your application should call glGetError after every OpenGL ES command.
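A common way to do this is a checking macro that wraps every GL call in debug builds and compiles down to the bare call when NDEBUG is defined. The sketch below uses a stub gl_get_error in place of the real glGetError so it can run without a GL context; in a real project you would call glGetError and log with NSLog instead of fprintf.

```c
#include <stdio.h>

/* Stand-in for glGetError so this sketch runs without a GL context.
 * In real code, call glGetError() here instead. */
static unsigned int g_fake_error = 0;
static unsigned int gl_get_error(void) {
    unsigned int e = g_fake_error;
    g_fake_error = 0;  /* glGetError also clears the error flag */
    return e;
}

static int g_errors_logged = 0;

#ifndef NDEBUG
#define GL_CHECK(stmt) do {                                         \
        stmt;                                                       \
        unsigned int err_ = gl_get_error();                         \
        if (err_ != 0) {                                            \
            fprintf(stderr, "GL error 0x%x after %s\n", err_, #stmt); \
            g_errors_logged++;                                      \
        }                                                           \
    } while (0)
#else
#define GL_CHECK(stmt) stmt  /* release build: no per-call query */
#endif
```

Usage would look like GL_CHECK(glBindTexture(GL_TEXTURE_2D, tex)); the release build then avoids the CPU/GPU synchronization that glGetError causes.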


Ch3. Working with EAGL

// creating EAGL context

EAGLContext* myContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
[EAGLContext setCurrentContext: myContext];


// create framebuffer object

1. create framebuffer object

2. create target (renderbuffer or texture), allocate storage for the target, attach it to the framebuffer object

3. test framebuffer for completeness


1. Create the framebuffer and bind it so that future OpenGL ES framebuffer commands are directed to it. 

GLuint framebuffer;

glGenFramebuffersOES(1, &framebuffer);

glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer);

2. Create a color renderbuffer, allocate storage for it, and attach it to the framebuffer. 

GLuint colorRenderbuffer;

glGenRenderbuffersOES(1, &colorRenderbuffer);

glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);

glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_RGBA8_OES, width, height);

glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, colorRenderbuffer);

3. Perform similar steps to create and attach a depth renderbuffer. 

GLuint depthRenderbuffer;

glGenRenderbuffersOES(1, &depthRenderbuffer);

glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthRenderbuffer);

glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES, width, height);

glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES,GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthRenderbuffer);

4. Test the framebuffer for completeness. 

GLenum status = glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES);

if(status != GL_FRAMEBUFFER_COMPLETE_OES) {

    NSLog(@"failed to make complete framebuffer object %x", status);

}


If your CAEAGLLayer object must be blended with other layers, you will see a significant performance penalty. You can reduce this penalty by placing your CAEAGLLayer behind other UIKit layers.


* Sharegroup *

EAGLContext* firstContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
EAGLContext* secondContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1 sharegroup:[firstContext sharegroup]];


Ch4. Working with Vertex Data

Avoid the use of the OpenGL ES GL_FIXED data type.

- use glDrawArrays to draw your geometry

- use glDrawElements to specify indices for the triangles in your geometry


If the data does not change from frame to frame, use a vertex buffer object (VBO).


Ch5. Working with Texture Data

PowerVR Texture Compression (PVRTC) format, exposed through the GL_IMG_texture_compression_pvrtc extension


Future Apple hardware may not support the PVRTC texture format. You must test for the existence of the compressed texture extension before using it.


Create and load your texture data during initialization.


Binding to a texture changes OpenGL ES state. Avoid unnecessary changes.


One way to avoid changing the texture is to combine multiple smaller textures into a single larger texture, known as a texture atlas.
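With an atlas, each quad's texture coordinates must be remapped from a pixel rectangle inside the atlas to normalized [0,1] coordinates. A minimal sketch of that remapping (the struct and function names are mine, not from the guide):

```c
typedef struct { float u0, v0, u1, v1; } UVRect;

/* Convert a sub-image's pixel rectangle (x, y, subW, subH) inside an
 * atlasW x atlasH atlas into normalized texture coordinates. */
static UVRect atlas_uv(int x, int y, int subW, int subH, int atlasW, int atlasH)
{
    UVRect r;
    r.u0 = (float)x / (float)atlasW;
    r.v0 = (float)y / (float)atlasH;
    r.u1 = (float)(x + subW) / (float)atlasW;
    r.v1 = (float)(y + subH) / (float)atlasH;
    return r;
}
```

In practice you may also want to inset the rectangle by half a texel to avoid bleeding from neighboring sub-images when filtering.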


Your application should provide mipmaps for all textures except those being used to draw 2D unscaled images.

The GL_LINEAR_MIPMAP_LINEAR filter mode provides the best quality; GL_LINEAR_MIPMAP_NEAREST gives better performance.
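When budgeting texture memory for mipmaps, it helps to know how many levels a complete chain has: each level halves both dimensions (clamped at 1) down to 1x1, giving floor(log2(max(w, h))) + 1 levels. A small helper to compute this (my own, for illustration):

```c
/* Number of levels in a complete mipmap chain for a w x h texture. */
static int mipmap_level_count(int w, int h)
{
    int levels = 1;                 /* the base level itself */
    while (w > 1 || h > 1) {
        w = (w > 1) ? w / 2 : 1;    /* halve, clamping at 1 */
        h = (h > 1) ? h / 2 : 1;
        levels++;
    }
    return levels;
}
```

A full chain adds roughly one third to the base level's storage, which is usually a good trade for the filtering quality and bandwidth savings.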


Ch6. Performance

* Redraw Scenes Only when Necessary

: A slower but fixed frame rate (e.g. 30 fps) often appears smoother to the user than a fast but variable frame rate.


* Use Floating Point Arithmetic

: use the ARM instruction set, not Thumb


* Disable Unused OpenGL ES features


* Minimize the Number of Draw Calls

: consolidate geometry that is in close spatial proximity


* Memory

- After loading an image into your OpenGL ES texture using glTexImage, you can free the original image.

- Only allocate a depth buffer when your application requires it.

- If your application does not need all of its resources at once, load only a subset of the total resources.


* Avoid Querying OpenGL ES State

: Calls to glGet*(), including glGetError(), may require OpenGL ES to execute all previous commands before retrieving any state variables. This synchronization forces the graphics hardware to run in lockstep with the CPU, reducing opportunities for parallelism. Use glGetError() only in debug builds.


* Avoid Changing OpenGL ES State Unnecessarily


* Drawing order

: Do not waste CPU time sorting objects front to back.

: Sort objects by their opacity (opaque > alpha testing > alpha blended)


App.A. Using Texturetool

ex) Encode Image.png into PVRTC using linear weights and 4 bpp, saving the output as ImageL4.pvrtc and a PNG preview as ImageL4.png:


user$ texturetool -e PVRTC --channel-weighting-linear --bits-per-pixel-4 -o ImageL4.pvrtc -p ImageL4.png Image.png


ex) uploading image

void texImage2DPVRTC(GLint level, GLsizei bpp, GLboolean hasAlpha, GLsizei width, GLsizei height, void *pvrtcData)
{
    GLenum format;
    GLsizei size = width * height * bpp / 8;
    if (hasAlpha) {
        format = (bpp == 4) ? GL_COMPRESSED_RGBA_PVRTC_4BPPV1_IMG : GL_COMPRESSED_RGBA_PVRTC_2BPPV1_IMG;
    } else {
        format = (bpp == 4) ? GL_COMPRESSED_RGB_PVRTC_4BPPV1_IMG : GL_COMPRESSED_RGB_PVRTC_2BPPV1_IMG;
    }
    if (size < 32) {
        size = 32;  // PVRTC data is never smaller than 32 bytes
    }
    glCompressedTexImage2D(GL_TEXTURE_2D, level, format, width, height, 0, size, pvrtcData);
}




Source: http://blog.naver.com/PostView.nhn?blogId=gonagi&logNo=150067227174
Memory Management Programming Guide for Cocoa

- garbage collected environment: Mac OS X 10.5+

- reference counted environment: iPhone OS


Object ownership policy: responsibility for disposal


** policy **

- You own any object you create (alloc, new, copy)

- If you own an object, you are responsible for relinquishing ownership when you have finished with it.

- If you do not own an object (convenience constructor), you must not release it.


// typical good example

Thingamajig *thingamajig = [[Thingamajig alloc] init];

// ...

NSArray *sprockets = [thingamajig sprockets];

// ...

[thingamajig release];


// typical wrong example

// 1 - convenience method

Thingamajig *thingamajig = [Thingamajig thingamajig];

[thingamajig release]; // wrong


// 2 - function loses the chance to release

+ (Thingamajig *)thingamajig {

    id newThingamajig = [[Thingamajig alloc] init];

    return newThingamajig;

}


// 3 - no owner, disposed before returned

+ (Thingamajig *)thingamajig {

    id newThingamajig = [[Thingamajig alloc] init];

    [newThingamajig release];

    return newThingamajig; // newThingamajig is invalid here

}
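Stripped of Objective-C syntax, the ownership policy above is plain reference counting: alloc gives the creator one ownership claim, retain adds a claim, and the release that drops the count to zero deallocates. A minimal C sketch (the type and function names are mine, purely illustrative):

```c
#include <stdlib.h>

typedef struct Obj {
    int refcount;
} Obj;

/* "alloc": the creator owns the object, so it starts at refcount 1. */
static Obj *obj_create(void) {
    Obj *o = (Obj *)malloc(sizeof(Obj));
    o->refcount = 1;
    return o;
}

/* "retain": claim ownership of an object you did not create. */
static void obj_retain(Obj *o) { o->refcount++; }

/* "release": relinquish ownership; the last owner's release deallocates.
 * Returns 1 if the object was deallocated, 0 if it is still alive. */
static int obj_release(Obj *o) {
    if (--o->refcount == 0) {
        free(o);
        return 1;
    }
    return 0;
}
```

The "wrong" examples above map directly onto this model: releasing an object you never retained (example 1) drops the count below its owners' claims, and releasing before returning (example 3) hands the caller a freed pointer.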


autorelease - you declare that you don't want to own the object beyond the scope in which you sent autorelease.


- (NSArray *)sprockets {
    NSArray *array = [[NSArray alloc] initWithObjects:mainSprocket, auxiliarySprocket, nil];
    return [array autorelease];
}


strong reference: a pointer that retains the object

weak reference: a pointer that does not retain (usually used so two objects avoid retaining each other), e.g. table data sources, outline view items, notification observers, and miscellaneous targets and delegates

** The holder of a weak reference SHOULD be notified when the object is deallocated. **


** Don't piggy-back resource management on top of dealloc.

You should typically not manage scarce resources such as file descriptors, network connections, and buffers/caches in a dealloc method. Invocation of dealloc might be delayed or sidestepped.


When you add an object to a collection such as an array, dictionary, or set, the collection takes ownership of it.


** autorelease pools **

An autorelease pool is an instance of NSAutoreleasePool that “contains” other objects that have received an autorelease message; when the autorelease pool is deallocated, it sends a release message to each of those objects. An object can be put into an autorelease pool several times, and receives a release message for each time it was put into the pool.


If you send an autorelease message when a pool is not available, Cocoa logs a suitable error message.


Autorelease pools are arranged in a stack, although they are commonly referred to as being "nested." When an object is sent an autorelease message, it is added to the current topmost pool for the current thread.


If you spawn a secondary thread, you must create your own autorelease pool as soon as the thread begins executing.


Any autoreleased objects created during the lifetime of the task are not disposed of until the task completes.


If you release an autorelease pool that is not the top of the stack, this causes all (unreleased) autorelease pools above it on the stack to be released, along with all their objects.
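That stacking rule can be modeled with nothing more than a depth counter: each pending autoreleased object remembers the depth of the pool it joined, and draining the pool at depth d releases every object whose pool depth is d or greater. This sketch is my own bookkeeping model, not Apple's implementation:

```c
/* Model the autorelease-pool stack as a depth counter, plus the pool
 * depth of each pending autoreleased object. */
#define MAX_PENDING 64

static int pool_depth = 0;              /* current height of the pool stack */
static int pending_depth[MAX_PENDING];  /* pool depth of each pending object */
static int pending_count = 0;

static int pool_push(void) { return ++pool_depth; }

/* An autoreleased object joins the topmost pool on the stack. */
static void obj_autorelease(void) {
    pending_depth[pending_count++] = pool_depth;
}

/* Drain the pool at depth d. Returns how many pending releases fired,
 * including those owed to pools nested above d. */
static int pool_drain(int d) {
    int released = 0, kept = 0;
    for (int i = 0; i < pending_count; i++) {
        if (pending_depth[i] >= d)
            released++;                          /* this pool or one above it */
        else
            pending_depth[kept++] = pending_depth[i];
    }
    pending_count = kept;
    pool_depth = d - 1;                          /* pools at >= d are gone */
    return released;
}
```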


in a garbage-collected environment

- release: no-op

- drain of autorelease pool: triggers garbage collection (if memory allocated since the last collection exceeds the current threshold)


You should use drain rather than release to dispose of an autorelease pool.


NSCopyObject creates an exact shallow copy of an object by copying instance variable values but not the data they point to.
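The shallow-copy pitfall is easy to demonstrate in plain C: copying a struct bit-for-bit copies its pointer fields, not the data behind them, so the copy and the original end up sharing the same buffer. The type and function names below are mine, for illustration only:

```c
#include <string.h>

typedef struct {
    int id;
    char *name;   /* pointer field: shallow copies share this buffer */
} Record;

/* Shallow copy in the spirit of NSCopyObject: instance variables are
 * copied bit-for-bit, so both records point at the same name data. */
static Record shallow_copy(const Record *src) {
    Record dst;
    memcpy(&dst, src, sizeof(Record));
    return dst;
}
```

This is why classes whose instance variables point to mutable data usually need a deep copy, or at least careful ownership handling, in their copyWithZone: implementations.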


Zones are page-aligned areas of memory that hold the objects and data allocated by an application. The system initially assigns each application a “default” zone, and applications can create additional zones later.



Source: http://blog.naver.com/PostView.nhn?blogId=gonagi&logNo=150052475165
